Dec 15 13:53:57 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 15 13:53:57 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 15 13:53:57 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 
13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 
13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 15 13:53:57 crc restorecon[4681]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:57 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 15 13:53:58 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 15 13:53:58 crc kubenswrapper[4794]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 15 13:53:58 crc kubenswrapper[4794]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 15 13:53:58 crc kubenswrapper[4794]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 15 13:53:58 crc kubenswrapper[4794]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 15 13:53:58 crc kubenswrapper[4794]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 15 13:53:58 crc kubenswrapper[4794]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.535077 4794 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545080 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545146 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545156 4794 feature_gate.go:330] unrecognized feature gate: Example Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545165 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545173 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545183 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545191 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545199 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545207 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 
13:53:58.545215 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545223 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545232 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545240 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545248 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545257 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545265 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545273 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545281 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545293 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545306 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545318 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545330 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545339 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545349 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545358 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545368 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545377 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545385 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545393 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545401 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545409 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545419 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545429 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545437 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545445 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545454 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545462 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545471 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545480 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545487 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545495 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545503 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545511 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545519 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545527 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545535 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545543 4794 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545551 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545559 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545566 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545602 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545611 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545619 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545626 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545634 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545641 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545649 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545656 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545664 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545671 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545679 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 15 
13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545687 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545694 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545702 4794 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545710 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545717 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545725 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545733 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545742 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545753 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
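The long runs of `feature_gate.go:330] unrecognized feature gate: …` warnings above come from OpenShift-specific gate names being handed to the kubelet's upstream feature-gate parser. To summarize which gates are flagged, the journal can be scanned directly. A minimal sketch (the `unrecognized_gates` helper is hypothetical, not part of the kubelet; it only assumes the journald line format shown in this log):

```python
import re

# Matches the kubelet warning, e.g.
# "W1215 13:53:58.545306 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP"
GATE_RE = re.compile(r"feature_gate\.go:330\] unrecognized feature gate: (\S+)")

def unrecognized_gates(lines):
    """Return the sorted, de-duplicated set of gate names flagged as unrecognized."""
    return sorted({m.group(1) for line in lines for m in GATE_RE.finditer(line)})

sample = [
    "Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545306 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP",
    "Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545318 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode",
]
print(unrecognized_gates(sample))  # ['ImageStreamImportMode', 'MultiArchInstallGCP']
```

De-duplicating matters here because, as the later sections of this log show, the kubelet emits the same warning set several times during startup.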
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.545762 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.545960 4794 flags.go:64] FLAG: --address="0.0.0.0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.545978 4794 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.545995 4794 flags.go:64] FLAG: --anonymous-auth="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546006 4794 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546017 4794 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546026 4794 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546038 4794 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546048 4794 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546059 4794 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546068 4794 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546078 4794 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546102 4794 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546111 4794 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546120 4794 flags.go:64] FLAG: --cgroup-root="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546129 4794 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 15 13:53:58 
crc kubenswrapper[4794]: I1215 13:53:58.546138 4794 flags.go:64] FLAG: --client-ca-file="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546147 4794 flags.go:64] FLAG: --cloud-config="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546155 4794 flags.go:64] FLAG: --cloud-provider="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546165 4794 flags.go:64] FLAG: --cluster-dns="[]" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546177 4794 flags.go:64] FLAG: --cluster-domain="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546185 4794 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546194 4794 flags.go:64] FLAG: --config-dir="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546203 4794 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546213 4794 flags.go:64] FLAG: --container-log-max-files="5" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546223 4794 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546233 4794 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546242 4794 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546251 4794 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546262 4794 flags.go:64] FLAG: --contention-profiling="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546271 4794 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546280 4794 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546291 4794 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 15 13:53:58 crc 
kubenswrapper[4794]: I1215 13:53:58.546299 4794 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546310 4794 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546320 4794 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546328 4794 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546337 4794 flags.go:64] FLAG: --enable-load-reader="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546347 4794 flags.go:64] FLAG: --enable-server="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546355 4794 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546374 4794 flags.go:64] FLAG: --event-burst="100" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546383 4794 flags.go:64] FLAG: --event-qps="50" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546392 4794 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546401 4794 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546410 4794 flags.go:64] FLAG: --eviction-hard="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546421 4794 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546430 4794 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546438 4794 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546450 4794 flags.go:64] FLAG: --eviction-soft="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546460 4794 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 15 
13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546469 4794 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546479 4794 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546488 4794 flags.go:64] FLAG: --experimental-mounter-path="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546497 4794 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546506 4794 flags.go:64] FLAG: --fail-swap-on="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546514 4794 flags.go:64] FLAG: --feature-gates="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546525 4794 flags.go:64] FLAG: --file-check-frequency="20s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546534 4794 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546544 4794 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546553 4794 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546562 4794 flags.go:64] FLAG: --healthz-port="10248" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546573 4794 flags.go:64] FLAG: --help="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546608 4794 flags.go:64] FLAG: --hostname-override="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546618 4794 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546628 4794 flags.go:64] FLAG: --http-check-frequency="20s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546637 4794 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546646 4794 flags.go:64] FLAG: --image-credential-provider-config="" Dec 15 
13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546655 4794 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546663 4794 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546672 4794 flags.go:64] FLAG: --image-service-endpoint="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546681 4794 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546690 4794 flags.go:64] FLAG: --kube-api-burst="100" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546699 4794 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546708 4794 flags.go:64] FLAG: --kube-api-qps="50" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546717 4794 flags.go:64] FLAG: --kube-reserved="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546726 4794 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546736 4794 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546745 4794 flags.go:64] FLAG: --kubelet-cgroups="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546753 4794 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546762 4794 flags.go:64] FLAG: --lock-file="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546770 4794 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546779 4794 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546789 4794 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546811 4794 flags.go:64] FLAG: --log-json-split-stream="false" Dec 15 
13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546822 4794 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546831 4794 flags.go:64] FLAG: --log-text-split-stream="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546840 4794 flags.go:64] FLAG: --logging-format="text" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546849 4794 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546858 4794 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546867 4794 flags.go:64] FLAG: --manifest-url="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546876 4794 flags.go:64] FLAG: --manifest-url-header="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546887 4794 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546897 4794 flags.go:64] FLAG: --max-open-files="1000000" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546908 4794 flags.go:64] FLAG: --max-pods="110" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546917 4794 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546926 4794 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546935 4794 flags.go:64] FLAG: --memory-manager-policy="None" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546944 4794 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546953 4794 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546962 4794 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546971 4794 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546995 4794 flags.go:64] FLAG: --node-status-max-images="50" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547004 4794 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547013 4794 flags.go:64] FLAG: --oom-score-adj="-999" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547022 4794 flags.go:64] FLAG: --pod-cidr="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547030 4794 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547043 4794 flags.go:64] FLAG: --pod-manifest-path="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547052 4794 flags.go:64] FLAG: --pod-max-pids="-1" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547061 4794 flags.go:64] FLAG: --pods-per-core="0" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547070 4794 flags.go:64] FLAG: --port="10250" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547079 4794 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547087 4794 flags.go:64] FLAG: --provider-id="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547096 4794 flags.go:64] FLAG: --qos-reserved="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547105 4794 flags.go:64] FLAG: --read-only-port="10255" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547114 4794 flags.go:64] FLAG: --register-node="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547123 4794 flags.go:64] FLAG: --register-schedulable="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547131 4794 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547146 4794 flags.go:64] FLAG: --registry-burst="10" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547155 4794 flags.go:64] FLAG: --registry-qps="5" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547164 4794 flags.go:64] FLAG: --reserved-cpus="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547174 4794 flags.go:64] FLAG: --reserved-memory="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547185 4794 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547194 4794 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547204 4794 flags.go:64] FLAG: --rotate-certificates="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547212 4794 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547221 4794 flags.go:64] FLAG: --runonce="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547230 4794 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547239 4794 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547249 4794 flags.go:64] FLAG: --seccomp-default="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547258 4794 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547266 4794 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547275 4794 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547284 4794 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547294 
4794 flags.go:64] FLAG: --storage-driver-password="root" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547303 4794 flags.go:64] FLAG: --storage-driver-secure="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547312 4794 flags.go:64] FLAG: --storage-driver-table="stats" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547321 4794 flags.go:64] FLAG: --storage-driver-user="root" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547329 4794 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547339 4794 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547348 4794 flags.go:64] FLAG: --system-cgroups="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547357 4794 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547372 4794 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547381 4794 flags.go:64] FLAG: --tls-cert-file="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547389 4794 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547407 4794 flags.go:64] FLAG: --tls-min-version="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547416 4794 flags.go:64] FLAG: --tls-private-key-file="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547425 4794 flags.go:64] FLAG: --topology-manager-policy="none" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547465 4794 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547475 4794 flags.go:64] FLAG: --topology-manager-scope="container" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547485 4794 flags.go:64] FLAG: --v="2" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547496 4794 
flags.go:64] FLAG: --version="false" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547508 4794 flags.go:64] FLAG: --vmodule="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547518 4794 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.547528 4794 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547793 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547805 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547814 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547826 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547836 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547846 4794 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547854 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547863 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547871 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547879 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547887 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 
13:53:58.547895 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547902 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547910 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547917 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547925 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547933 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547940 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547947 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547955 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547962 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547970 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547978 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547985 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.547993 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548006 4794 feature_gate.go:330] 
unrecognized feature gate: NetworkDiagnosticsConfig Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548014 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548022 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548030 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548038 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548046 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548056 4794 feature_gate.go:330] unrecognized feature gate: Example Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548065 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548073 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548081 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548088 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548096 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548104 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548112 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548121 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 15 13:53:58 crc 
kubenswrapper[4794]: W1215 13:53:58.548128 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548136 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548144 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548151 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548159 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548169 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548180 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548191 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548199 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548207 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548215 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548223 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548231 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548239 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548246 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548254 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548261 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548272 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548283 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
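The `flags.go:64] FLAG: --name="value"` lines earlier in the log are the kubelet echoing its effective command-line flags at startup. They can be collected into a dictionary for inspection. A minimal sketch (the `parse_flags` helper is an illustration, assuming the quoted `FLAG:` format seen in this log):

```python
import re

# Matches kubelet flag echo lines like:
# 'I1215 13:53:58.546102 4794 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"'
FLAG_RE = re.compile(r'flags\.go:64\] FLAG: --([\w.-]+)="([^"]*)"')

def parse_flags(lines):
    """Collect the kubelet's echoed command-line flags into a name -> value dict."""
    return {m.group(1): m.group(2) for line in lines for m in FLAG_RE.finditer(line)}

sample = [
    'Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546102 4794 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"',
    'Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.546185 4794 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"',
]
flags = parse_flags(sample)
print(flags["config"])  # /etc/kubernetes/kubelet.conf
```

Note that these are only the flag defaults and overrides as parsed; values such as `--cgroup-driver` can still be superseded by the config file named in `--config` (`/etc/kubernetes/kubelet.conf` in this log).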
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548292 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548300 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548308 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548316 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548328 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548335 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548343 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548351 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548358 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548366 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548374 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.548381 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.548658 4794 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.562273 4794 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.562334 4794 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562486 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562501 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562510 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562521 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562530 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562539 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562547 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562555 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562564 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562573 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562612 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562621 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562629 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562638 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562647 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562656 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562666 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562674 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562685 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562694 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562703 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562713 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562722 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562731 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562743 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562755 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562765 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562775 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562784 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562793 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562801 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562810 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562819 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562829 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562841 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562851 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562859 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562868 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562876 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562889 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562900 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562910 4794 feature_gate.go:330] unrecognized feature gate: Example
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562921 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562930 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562939 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562951 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562961 4794 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562971 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562981 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562990 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.562999 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563008 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563017 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563029 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563039 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563048 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563057 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563066 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563075 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563084 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563093 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563101 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563110 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563119 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563127 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563135 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563144 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563152 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563161 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563170 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563187 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.563204 4794 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563500 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563514 4794 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563524 4794 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563533 4794 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563543 4794 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563552 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563561 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563570 4794 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563601 4794 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563611 4794 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563621 4794 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563630 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563640 4794 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563649 4794 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563658 4794 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563666 4794 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563675 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563684 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563693 4794 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563703 4794 feature_gate.go:330] unrecognized feature gate: Example
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563712 4794 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563720 4794 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563728 4794 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563737 4794 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563746 4794 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563755 4794 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563763 4794 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563772 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563780 4794 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563792 4794 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563804 4794 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563813 4794 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563824 4794 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563834 4794 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563845 4794 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563854 4794 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563865 4794 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563874 4794 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563883 4794 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563894 4794 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563905 4794 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563914 4794 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563924 4794 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563932 4794 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563941 4794 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563950 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563959 4794 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563968 4794 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563977 4794 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563985 4794 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.563994 4794 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564004 4794 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564013 4794 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564021 4794 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564029 4794 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564038 4794 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564048 4794 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564056 4794 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564067 4794 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564078 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564087 4794 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564096 4794 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564106 4794 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564117 4794 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564128 4794 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564138 4794 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564147 4794 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564156 4794 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564165 4794 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564174 4794 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.564184 4794 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.564199 4794 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.564817 4794 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.569708 4794 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.569863 4794 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.570754 4794 server.go:997] "Starting client certificate rotation"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.570794 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.571635 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 19:19:04.104850609 +0000 UTC
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.571754 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.587842 4794 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.589435 4794 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.594461 4794 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.608108 4794 log.go:25] "Validated CRI v1 runtime API"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.632633 4794 log.go:25] "Validated CRI v1 image API"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.634916 4794 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.638833 4794 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-15-13-49-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.638878 4794 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.666199 4794 manager.go:217] Machine: {Timestamp:2025-12-15 13:53:58.663950751 +0000 UTC m=+0.515973279 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2e6b0193-6ba1-4635-a26c-e50e20b7171c BootID:134a30f2-e02c-4026-a8fd-915d12b3ae90 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:eb:83:00 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:eb:83:00 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d6:1a:12 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:66:1b:f6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a9:6e:ff Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3c:bc:54 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:76:56:86:7b:c8:d5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:83:b7:90:1d:6b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.666629 4794 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.666979 4794 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.668492 4794 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.668833 4794 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.668901 4794 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.669253 4794 topology_manager.go:138] "Creating topology manager with none policy"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.669272 4794 container_manager_linux.go:303] "Creating device plugin manager"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.669546 4794 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.669633 4794 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.670049 4794 state_mem.go:36] "Initialized new in-memory state store"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.670202 4794 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.671164 4794 kubelet.go:418] "Attempting to sync node with API server"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.671199 4794 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.671248 4794 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.671271 4794 kubelet.go:324] "Adding apiserver pod source"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.671290 4794 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.674616 4794 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.675132 4794 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.676672 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.676704 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.676793 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.676798 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.676950 4794 kubelet.go:854] "Not starting ClusterTrustBundle informer because
we are in static kubelet mode" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677754 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677798 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677814 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677828 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677851 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677866 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677880 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677903 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677954 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677970 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.677995 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.678010 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.679348 4794 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.680135 4794 server.go:1280] "Started kubelet" Dec 15 
13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.680338 4794 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.681261 4794 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.681469 4794 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 15 13:53:58 crc systemd[1]: Started Kubernetes Kubelet. Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.682775 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.682965 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.683024 4794 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.683166 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.683314 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:36:43.154333368 +0000 UTC Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.683537 4794 server.go:460] "Adding debug handlers to kubelet server" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.683565 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" 
Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.683667 4794 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.683694 4794 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.683895 4794 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.688108 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.688540 4794 factory.go:55] Registering systemd factory Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.688421 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.688571 4794 factory.go:221] Registration of the systemd container factory successfully Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.689095 4794 factory.go:153] Registering CRI-O factory Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.689139 4794 factory.go:221] Registration of the crio container factory successfully Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.688571 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188167f6d17b3198 default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-15 13:53:58.6800726 +0000 UTC m=+0.532095078,LastTimestamp:2025-12-15 13:53:58.6800726 +0000 UTC m=+0.532095078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.689361 4794 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.689429 4794 factory.go:103] Registering Raw factory Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.689468 4794 manager.go:1196] Started watching for new ooms in manager Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.693080 4794 manager.go:319] Starting recovery of all containers Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703119 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703561 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703607 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703626 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703643 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703659 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703676 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703697 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703717 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" 
seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703733 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703751 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703767 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703785 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703805 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703820 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703837 4794 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703866 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703882 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703898 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703916 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703936 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703953 4794 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703970 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.703986 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704002 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704064 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704085 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704105 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704122 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704140 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704156 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704172 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704193 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704209 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704227 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704243 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704259 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704277 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704292 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704308 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" 
seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704327 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704347 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704366 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704381 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704396 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.704412 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 15 13:53:58 crc 
kubenswrapper[4794]: I1215 13:53:58.704428 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705006 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705133 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705164 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705202 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705228 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705270 4794 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705309 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705332 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705356 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705380 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705403 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705425 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705457 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705483 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705505 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705526 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705549 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705608 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705631 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705651 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705674 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705696 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705717 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705739 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705780 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705801 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705820 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705846 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705943 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705964 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.705985 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.706008 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707434 4794 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707471 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707515 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707532 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707548 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707565 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707645 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707662 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707677 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707695 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707708 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707729 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707744 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707774 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707804 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707824 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707873 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707893 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707913 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707930 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707948 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707973 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.707993 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708015 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708039 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708056 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708083 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708103 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" 
seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708131 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708155 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708172 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708188 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708207 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708229 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 
13:53:58.708245 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708261 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708279 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708295 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708338 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708356 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708385 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708399 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708420 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708438 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708456 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708473 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708577 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708616 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708633 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708669 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708685 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708699 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708714 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708729 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708744 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708778 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708795 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708835 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708876 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" 
seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708893 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708908 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708957 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708972 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.708986 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709000 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709067 4794 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709081 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709124 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709139 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709152 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709166 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709197 4794 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709213 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709250 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709266 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709281 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709295 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709309 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709324 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709338 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709352 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709387 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709401 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709415 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709467 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709481 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709495 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709539 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709553 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709634 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709657 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709709 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709732 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709746 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709762 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709776 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709791 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709831 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709849 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709871 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709916 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709934 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" 
seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.709956 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710028 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710042 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710073 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710087 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710109 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710123 4794 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710153 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710267 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710286 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710337 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710418 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710445 4794 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710470 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710490 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710507 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710525 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710664 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710686 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710796 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710818 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710859 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710927 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710955 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.710974 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.711000 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.711017 4794 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.711101 4794 reconstruct.go:97] "Volume reconstruction finished" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.711123 4794 reconciler.go:26] "Reconciler: start to sync state" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.713934 4794 manager.go:324] Recovery completed Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.731409 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.732421 4794 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.734474 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.734517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.734526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.735191 4794 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.735222 4794 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.735265 4794 state_mem.go:36] "Initialized new in-memory state store" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.735770 4794 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.735819 4794 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.735850 4794 kubelet.go:2335] "Starting kubelet main sync loop" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.735905 4794 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 15 13:53:58 crc kubenswrapper[4794]: W1215 13:53:58.737944 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.738031 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.747169 4794 policy_none.go:49] "None policy: Start" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.748286 4794 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.748325 4794 state_mem.go:35] "Initializing new in-memory state store" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.783244 4794 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.826684 4794 manager.go:334] "Starting Device Plugin manager" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.826816 4794 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.826834 4794 server.go:79] "Starting device plugin registration server" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.827207 4794 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.827227 4794 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.827477 4794 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.827548 4794 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.827559 4794 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.833466 4794 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.836043 4794 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.836516 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.838158 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.838213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.838227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.838388 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.838895 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.838949 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.839555 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.839606 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.839618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.839767 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.839876 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.839904 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840878 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840921 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840943 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840954 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.840958 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.841031 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.841049 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.841169 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.841276 
4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.841312 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842315 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842323 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842334 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842338 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842546 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842639 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.842670 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.843538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.843561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.843571 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.844109 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.844132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.844149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.844310 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.844341 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.845690 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.845720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.845735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.884781 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913234 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913327 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913402 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913451 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913485 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913510 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913531 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913572 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913611 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913642 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913684 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913722 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.913744 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.928348 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.929819 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.929873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.929897 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:58 crc kubenswrapper[4794]: I1215 13:53:58.929939 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:53:58 crc kubenswrapper[4794]: E1215 13:53:58.930639 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 15 13:53:59 crc 
kubenswrapper[4794]: I1215 13:53:59.015315 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015412 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015477 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015518 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015555 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015648 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015686 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015765 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015770 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015787 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015848 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015810 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015800 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015749 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015851 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.015986 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016021 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016050 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016076 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016102 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016094 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016148 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016120 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016149 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016118 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.016278 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.131022 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.132648 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.132703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.132721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.132755 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:53:59 crc kubenswrapper[4794]: E1215 13:53:59.133402 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 15 
13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.189422 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.210386 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.224617 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.241568 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.250777 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:53:59 crc kubenswrapper[4794]: E1215 13:53:59.286321 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.314318 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-081a115c0d91291d427bceec609eb4f53d9ff56ba5c8bd4f1e8294d10dafd73f WatchSource:0}: Error finding container 081a115c0d91291d427bceec609eb4f53d9ff56ba5c8bd4f1e8294d10dafd73f: Status 404 returned error can't find the container with id 081a115c0d91291d427bceec609eb4f53d9ff56ba5c8bd4f1e8294d10dafd73f Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.319515 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2b8c1badf8c3c8744ffb4001d78a9a93f5934d0b3fda4f2b0690e8f7d5d28f34 WatchSource:0}: Error finding container 2b8c1badf8c3c8744ffb4001d78a9a93f5934d0b3fda4f2b0690e8f7d5d28f34: Status 404 returned error can't find the container with id 2b8c1badf8c3c8744ffb4001d78a9a93f5934d0b3fda4f2b0690e8f7d5d28f34 Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.324647 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c8e57832a019e43d7edba2166807a7b9f63c08a098cc0bff592ed4782c37d50e WatchSource:0}: Error finding container c8e57832a019e43d7edba2166807a7b9f63c08a098cc0bff592ed4782c37d50e: Status 404 returned error can't find the container with id c8e57832a019e43d7edba2166807a7b9f63c08a098cc0bff592ed4782c37d50e Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.325118 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4fe072594ffdf3e5c090a78aa11f7ac90e4696e5958b4dd2cdad3366a72adba9 WatchSource:0}: Error finding container 4fe072594ffdf3e5c090a78aa11f7ac90e4696e5958b4dd2cdad3366a72adba9: Status 404 returned error can't find the container with id 4fe072594ffdf3e5c090a78aa11f7ac90e4696e5958b4dd2cdad3366a72adba9 Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.327379 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1e98e40c7c8ee2defdf41c8229626bfe9ec2a8314bd9b5e51fb7893117c0abf6 WatchSource:0}: Error finding container 1e98e40c7c8ee2defdf41c8229626bfe9ec2a8314bd9b5e51fb7893117c0abf6: Status 404 returned error can't find the container with id 
1e98e40c7c8ee2defdf41c8229626bfe9ec2a8314bd9b5e51fb7893117c0abf6 Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.533763 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.536101 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.536154 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.536172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.536212 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:53:59 crc kubenswrapper[4794]: E1215 13:53:59.536717 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.682205 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:53:59 crc kubenswrapper[4794]: E1215 13:53:59.682318 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.683454 4794 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 14:10:17.480213622 +0000 UTC Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.683832 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.740188 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c8e57832a019e43d7edba2166807a7b9f63c08a098cc0bff592ed4782c37d50e"} Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.741532 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2b8c1badf8c3c8744ffb4001d78a9a93f5934d0b3fda4f2b0690e8f7d5d28f34"} Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.742728 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"081a115c0d91291d427bceec609eb4f53d9ff56ba5c8bd4f1e8294d10dafd73f"} Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.743993 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1e98e40c7c8ee2defdf41c8229626bfe9ec2a8314bd9b5e51fb7893117c0abf6"} Dec 15 13:53:59 crc kubenswrapper[4794]: I1215 13:53:59.746069 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4fe072594ffdf3e5c090a78aa11f7ac90e4696e5958b4dd2cdad3366a72adba9"} Dec 15 13:53:59 crc kubenswrapper[4794]: W1215 13:53:59.821282 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:53:59 crc kubenswrapper[4794]: E1215 13:53:59.821394 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:54:00 crc kubenswrapper[4794]: E1215 13:54:00.088135 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Dec 15 13:54:00 crc kubenswrapper[4794]: W1215 13:54:00.099832 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:54:00 crc kubenswrapper[4794]: E1215 13:54:00.099947 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:54:00 crc kubenswrapper[4794]: W1215 
13:54:00.154407 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:54:00 crc kubenswrapper[4794]: E1215 13:54:00.154507 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.337414 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.338759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.338808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.338822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.338850 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:54:00 crc kubenswrapper[4794]: E1215 13:54:00.339315 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.607317 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 15 13:54:00 crc kubenswrapper[4794]: E1215 13:54:00.608398 4794 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.683656 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 15:01:55.432636167 +0000 UTC Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.683783 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.755975 4794 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f0f757a06f4652e693a1d9f6990b68e4858ab7934a5ca04475e98d32a30c10da" exitCode=0 Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.756058 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f0f757a06f4652e693a1d9f6990b68e4858ab7934a5ca04475e98d32a30c10da"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.756090 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.757328 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.757351 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 
13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.757359 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.757905 4794 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e" exitCode=0 Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.757957 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.758035 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.758696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.758722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.758731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.760853 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.761011 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.761029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.763402 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042" exitCode=0 Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.763515 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.763487 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.764473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.764526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.764537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.766146 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418" exitCode=0 Dec 15 13:54:00 crc 
kubenswrapper[4794]: I1215 13:54:00.766184 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418"} Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.766219 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.767171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.767186 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.767194 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.768677 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.770451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.770469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:00 crc kubenswrapper[4794]: I1215 13:54:00.770477 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.683849 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 15:17:36.217874795 +0000 UTC Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.684450 4794 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:54:01 crc kubenswrapper[4794]: E1215 13:54:01.688740 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.771609 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.771725 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.773271 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.773309 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.773321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.774713 4794 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="69ef63a985f295c4df2876df8c2b0476619c7b3ba98cc75730fc83fdc334ec2b" exitCode=0 Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.774824 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"69ef63a985f295c4df2876df8c2b0476619c7b3ba98cc75730fc83fdc334ec2b"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.774831 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.775871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.775901 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.775913 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.778149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.778207 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.778239 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.778355 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:01 crc 
kubenswrapper[4794]: I1215 13:54:01.780815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.780869 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.780887 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.781453 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.781386 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.782542 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.782599 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.782619 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.784950 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.784979 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.784996 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.785009 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.785020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f"} Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.785025 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.785733 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.785764 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.785776 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.939798 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 
13:54:01.940737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.940763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.940773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:01 crc kubenswrapper[4794]: I1215 13:54:01.940792 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.684269 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 10:14:35.252564219 +0000 UTC Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.684945 4794 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 788h20m32.567623097s for next certificate rotation Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790403 4794 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a7b8ff07a882051dbc0e0bfd320e7b28efeb622cd125ac94f6c866e958e1fb25" exitCode=0 Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790482 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790501 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790526 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790530 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790557 4794 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790738 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790804 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a7b8ff07a882051dbc0e0bfd320e7b28efeb622cd125ac94f6c866e958e1fb25"} Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.790891 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791698 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791765 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791852 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791861 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791929 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.791710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.792216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:02 crc kubenswrapper[4794]: I1215 13:54:02.792227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.516680 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.796556 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43b1e9505ff1fd9a625c8ae3e46ed58d51879d19a588bd1a639ac66960654863"} Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.796619 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a682b7f02cbdb7ad5fabccb299baa31be6a0942fb10248ac3595e4798d3e1810"} Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.796636 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e33a44d9f6498853ab4a8fd59d3bc6a9792986cb5c403a7c10d78d25391b5aa0"} Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.796649 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.796700 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.797594 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.797631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:03 crc kubenswrapper[4794]: I1215 13:54:03.797643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.450934 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.451128 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.455238 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.455309 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:04 crc 
kubenswrapper[4794]: I1215 13:54:04.455330 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.632739 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.802725 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ebf2b1485a58b9f0a793b913a28b8ec77cd948921ede69683d2d8d4c92b6904"} Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.802769 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6bf61e2f0ed68373f32c905b5a350081a8761b20f411228c65e97c95334d1f84"} Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.802888 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.804011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.804057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.804070 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.854142 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.854442 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.854532 4794 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.856150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.856202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:04 crc kubenswrapper[4794]: I1215 13:54:04.856220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.346199 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.805338 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.805356 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.806967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.807012 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.807024 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.807043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.807083 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 
13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.807099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:05 crc kubenswrapper[4794]: I1215 13:54:05.870387 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 15 13:54:06 crc kubenswrapper[4794]: I1215 13:54:06.807705 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:06 crc kubenswrapper[4794]: I1215 13:54:06.808738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:06 crc kubenswrapper[4794]: I1215 13:54:06.808804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:06 crc kubenswrapper[4794]: I1215 13:54:06.808822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:07 crc kubenswrapper[4794]: I1215 13:54:07.013147 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:07 crc kubenswrapper[4794]: I1215 13:54:07.013292 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:07 crc kubenswrapper[4794]: I1215 13:54:07.014430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:07 crc kubenswrapper[4794]: I1215 13:54:07.014462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:07 crc kubenswrapper[4794]: I1215 13:54:07.014472 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:08 crc kubenswrapper[4794]: E1215 13:54:08.833636 4794 eviction_manager.go:285] "Eviction 
manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 15 13:54:08 crc kubenswrapper[4794]: I1215 13:54:08.939019 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:08 crc kubenswrapper[4794]: I1215 13:54:08.939296 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:08 crc kubenswrapper[4794]: I1215 13:54:08.941113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:08 crc kubenswrapper[4794]: I1215 13:54:08.941172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:08 crc kubenswrapper[4794]: I1215 13:54:08.941195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:09 crc kubenswrapper[4794]: I1215 13:54:09.052915 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:09 crc kubenswrapper[4794]: I1215 13:54:09.059951 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:09 crc kubenswrapper[4794]: I1215 13:54:09.816045 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:09 crc kubenswrapper[4794]: I1215 13:54:09.817099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:09 crc kubenswrapper[4794]: I1215 13:54:09.817179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:09 crc kubenswrapper[4794]: I1215 13:54:09.817204 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.013448 4794 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.013537 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.819197 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.820476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.820544 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.820562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:10 crc kubenswrapper[4794]: I1215 13:54:10.826006 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:11 crc kubenswrapper[4794]: I1215 13:54:11.014462 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:11 crc 
kubenswrapper[4794]: W1215 13:54:11.729771 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 15 13:54:11 crc kubenswrapper[4794]: I1215 13:54:11.729941 4794 trace.go:236] Trace[1506487295]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Dec-2025 13:54:01.728) (total time: 10001ms): Dec 15 13:54:11 crc kubenswrapper[4794]: Trace[1506487295]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:54:11.729) Dec 15 13:54:11 crc kubenswrapper[4794]: Trace[1506487295]: [10.001393374s] [10.001393374s] END Dec 15 13:54:11 crc kubenswrapper[4794]: E1215 13:54:11.729977 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 15 13:54:11 crc kubenswrapper[4794]: I1215 13:54:11.822092 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:11 crc kubenswrapper[4794]: I1215 13:54:11.823549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:11 crc kubenswrapper[4794]: I1215 13:54:11.823635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:11 crc kubenswrapper[4794]: I1215 13:54:11.823653 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:11 crc kubenswrapper[4794]: E1215 13:54:11.941761 4794 kubelet_node_status.go:99] "Unable to 
register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 15 13:54:12 crc kubenswrapper[4794]: W1215 13:54:12.108999 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.109194 4794 trace.go:236] Trace[1944034450]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Dec-2025 13:54:02.107) (total time: 10001ms): Dec 15 13:54:12 crc kubenswrapper[4794]: Trace[1944034450]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:54:12.108) Dec 15 13:54:12 crc kubenswrapper[4794]: Trace[1944034450]: [10.001859607s] [10.001859607s] END Dec 15 13:54:12 crc kubenswrapper[4794]: E1215 13:54:12.109240 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 15 13:54:12 crc kubenswrapper[4794]: E1215 13:54:12.308487 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.188167f6d17b3198 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-15 13:53:58.6800726 +0000 UTC 
m=+0.532095078,LastTimestamp:2025-12-15 13:53:58.6800726 +0000 UTC m=+0.532095078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 15 13:54:12 crc kubenswrapper[4794]: W1215 13:54:12.640076 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.640166 4794 trace.go:236] Trace[341664118]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Dec-2025 13:54:02.638) (total time: 10001ms): Dec 15 13:54:12 crc kubenswrapper[4794]: Trace[341664118]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:54:12.640) Dec 15 13:54:12 crc kubenswrapper[4794]: Trace[341664118]: [10.001590624s] [10.001590624s] END Dec 15 13:54:12 crc kubenswrapper[4794]: E1215 13:54:12.640191 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 15 13:54:12 crc kubenswrapper[4794]: W1215 13:54:12.643954 4794 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.644078 4794 trace.go:236] Trace[1987765640]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Dec-2025 13:54:02.641) (total time: 10002ms): 
Dec 15 13:54:12 crc kubenswrapper[4794]: Trace[1987765640]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (13:54:12.643) Dec 15 13:54:12 crc kubenswrapper[4794]: Trace[1987765640]: [10.002286756s] [10.002286756s] END Dec 15 13:54:12 crc kubenswrapper[4794]: E1215 13:54:12.644110 4794 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.685281 4794 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.824950 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.825813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.825847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.825858 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.847351 4794 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.847410 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.861670 4794 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 15 13:54:12 crc kubenswrapper[4794]: I1215 13:54:12.861732 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.116141 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.116465 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.117834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.117914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:13 crc 
kubenswrapper[4794]: I1215 13:54:13.117926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.142275 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.827492 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.828682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.828712 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.828725 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:13 crc kubenswrapper[4794]: I1215 13:54:13.867666 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.830426 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.831845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.831893 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.831911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.863315 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.863619 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.865302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.865371 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.865395 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:14 crc kubenswrapper[4794]: I1215 13:54:14.870378 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.142531 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.144357 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.144434 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.144452 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.144482 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:54:15 crc kubenswrapper[4794]: E1215 13:54:15.150004 4794 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 15 
13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.579770 4794 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.681640 4794 apiserver.go:52] "Watching apiserver" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.685007 4794 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.685240 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.685570 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.685615 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:15 crc kubenswrapper[4794]: E1215 13:54:15.685660 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.685756 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.685766 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:15 crc kubenswrapper[4794]: E1215 13:54:15.686380 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.686715 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.688163 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:15 crc kubenswrapper[4794]: E1215 13:54:15.688243 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.690508 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.690548 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.690608 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.690666 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.690780 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.691993 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.692034 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.692147 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.692837 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.731773 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.757047 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.776949 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.785136 4794 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.791824 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.799245 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.811569 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.820610 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:15 crc kubenswrapper[4794]: I1215 13:54:15.842639 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 15 13:54:16 crc kubenswrapper[4794]: I1215 13:54:16.141031 4794 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 15 13:54:16 crc kubenswrapper[4794]: I1215 13:54:16.380431 4794 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 
Dec 15 13:54:16 crc kubenswrapper[4794]: I1215 13:54:16.835923 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.521897 4794 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.737066 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.737109 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.737071 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.737263 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.737359 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.737521 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.847014 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.850062 4794 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.951615 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.951740 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.951799 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.951848 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.951896 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952001 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952066 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952192 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952225 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952350 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952539 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.952880 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953463 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953486 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953695 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953738 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953812 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953846 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953879 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953918 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953906 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953959 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.953993 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954029 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954065 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954099 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954136 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954172 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954205 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954239 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954272 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954356 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954391 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954428 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954465 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954500 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954534 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954573 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954649 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954719 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954752 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 15 
13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954825 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954865 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954905 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954938 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954972 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955011 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955044 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955079 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955114 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955201 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " 
Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955236 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955298 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955363 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955405 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955439 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955635 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955675 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955715 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955752 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955787 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955822 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955904 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955938 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955975 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956014 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956048 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956082 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956115 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956152 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956188 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956292 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc 
kubenswrapper[4794]: I1215 13:54:17.956330 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956367 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956402 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956438 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956480 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956555 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956653 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956693 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956727 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956765 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956802 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 15 13:54:17 crc 
kubenswrapper[4794]: I1215 13:54:17.956848 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956885 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956920 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956966 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957002 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957109 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957144 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957178 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957212 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 15 13:54:17 crc 
kubenswrapper[4794]: I1215 13:54:17.957249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957284 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957318 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957360 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957397 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957433 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957520 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957562 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957627 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957680 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957731 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957818 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957874 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957925 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957966 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958004 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958039 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958110 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958146 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958213 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 15 13:54:17 crc 
kubenswrapper[4794]: I1215 13:54:17.958251 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958287 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958349 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958416 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958479 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958532 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958620 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958669 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958717 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958776 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958826 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958877 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958932 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954824 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954844 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.954866 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955102 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955243 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955461 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955578 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955607 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955705 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955748 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955832 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955932 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.955957 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956011 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956232 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956262 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956346 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956343 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956385 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956533 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956635 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956670 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956791 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956874 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.956859 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957064 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957240 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957401 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957494 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957534 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.959553 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957856 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.957902 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958134 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958504 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.959125 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.959233 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.959558 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.959730 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.959732 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.960135 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.960456 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.960663 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.960771 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.961134 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.961819 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.958997 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.961972 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962008 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962422 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962448 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962472 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962493 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962514 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962536 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962558 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962595 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 15 13:54:17 crc 
kubenswrapper[4794]: I1215 13:54:17.962617 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962644 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962673 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962701 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962732 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962761 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962784 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962806 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962827 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962851 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962873 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962896 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962922 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962944 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962990 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963012 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " 
Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963100 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963135 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963165 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963189 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963209 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963235 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963261 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963285 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963313 
4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963339 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963364 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963398 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963431 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963458 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963479 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963599 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963637 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963659 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963706 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963728 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963751 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963773 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963796 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963818 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963840 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963863 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963885 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963916 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963948 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.963981 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964009 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964032 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964053 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964074 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964098 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964119 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964169 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964197 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964222 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964245 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964269 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964293 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964345 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964369 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964398 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964421 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964445 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964471 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 
13:54:17.964496 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964567 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964622 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964640 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964653 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964666 4794 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964678 4794 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node 
\"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964693 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964706 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964719 4794 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964734 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964747 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964760 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964773 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 
crc kubenswrapper[4794]: I1215 13:54:17.964785 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964797 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964810 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964823 4794 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964836 4794 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964848 4794 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964861 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 
13:54:17.965037 4794 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965051 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965063 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965856 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965873 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965886 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965899 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965911 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965923 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965935 4794 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965947 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965960 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965973 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965984 4794 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965997 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") 
on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966011 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966023 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966038 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966051 4794 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966063 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966075 4794 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966088 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc 
kubenswrapper[4794]: I1215 13:54:17.966100 4794 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966111 4794 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966124 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966135 4794 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966149 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966166 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966179 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966191 4794 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966204 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966217 4794 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966229 4794 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966241 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966252 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966264 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966276 4794 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.961829 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962000 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.962440 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964961 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.964989 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965726 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.965852 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.966013 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.967060 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.967130 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.967231 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.967632 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.967836 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.968014 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.968249 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.968472 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.970149 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.970899 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.971018 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.971026 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.982226 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:54:18.482199522 +0000 UTC m=+20.334221970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.985190 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.985309 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.986841 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.989058 4794 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.990081 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.991772 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.991886 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:18.491832185 +0000 UTC m=+20.343854633 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.993076 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.993134 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.993132 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.993456 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.993708 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.993951 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:18.493934975 +0000 UTC m=+20.345957423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.993104 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:17 crc kubenswrapper[4794]: E1215 13:54:17.994019 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:18.494003817 +0000 UTC m=+20.346026265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.994194 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.994635 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.996674 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.997150 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:17 crc kubenswrapper[4794]: I1215 13:54:17.997163 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:17.998871 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.002638 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.005053 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.005216 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.006223 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.006366 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.007643 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.007976 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.008377 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.008293 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.008684 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.008816 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.008975 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.009018 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.009706 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.009758 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.010523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.011300 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.011348 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.011682 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.011944 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.011975 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.012147 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.012322 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.012809 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.012708 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.013069 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.013322 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.013617 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.013666 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.013932 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.013984 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.014444 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.015069 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.015337 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.016322 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.016893 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.017560 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.017638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.018015 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.018188 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.018488 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.019022 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.019279 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.019675 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.022026 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.022124 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.022131 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.022179 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.022228 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.022254 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.023664 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.024080 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.024862 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.025259 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.025940 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.026823 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.025372 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.030927 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.031404 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.031898 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.032622 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.032736 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.032919 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.034511 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.034755 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.035680 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.036032 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.036056 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.036844 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.038177 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.039225 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.039431 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.039456 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.039809 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040063 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040083 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.040198 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.040427 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:18.540400832 +0000 UTC m=+20.392423280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040474 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040337 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040292 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040624 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.040797 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.041316 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.041621 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.041690 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.042060 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.042877 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.042086 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.042802 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.043215 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.043233 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.043481 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.044202 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.043912 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.045170 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.045202 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.045391 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.045795 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.045799 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046133 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046232 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046452 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046466 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046363 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046370 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046616 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046668 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046829 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046896 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.046918 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.050609 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.054145 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.062121 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066730 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066539 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066892 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066943 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066955 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066963 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066972 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066980 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066988 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.066997 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067005 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067015 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067026 4794 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067038 4794 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067048 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067058 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067069 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067079 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067088 4794 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067100 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067111 4794 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067122 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067130 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067138 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067146 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067154 4794 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067162 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067170 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067178 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067186 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067194 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067202 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067211 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067218 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067226 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067235 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067243 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067251 4794 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067259 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067267 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067276 4794 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067286 4794 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067294 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067303 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067314 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067323 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067330 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067338 4794 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067347 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067354 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067362 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067370 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067378 4794 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067386 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067395 4794 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067403 4794 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067411 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067419 4794 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067427 4794 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067434 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067442 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067450 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067458 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067466 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067473 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067481 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067489 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067496 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067505 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067513 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067521 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067530 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067538 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067538 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067546 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067596 4794 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067606 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 
15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067615 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067624 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067632 4794 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067640 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067647 4794 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067655 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067663 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067671 4794 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067679 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067687 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067695 4794 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067702 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067710 4794 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067718 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067725 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067734 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067743 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067751 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067760 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067768 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067775 4794 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067783 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067792 4794 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067801 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067809 4794 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067817 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067825 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067833 4794 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067841 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067849 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067858 4794 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067866 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067874 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067882 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067890 4794 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067898 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067907 4794 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc 
kubenswrapper[4794]: I1215 13:54:18.067915 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067923 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067932 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067941 4794 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067949 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067958 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067967 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067975 4794 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067983 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067991 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.067999 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068007 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068016 4794 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068024 4794 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068032 4794 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068041 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068049 4794 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068057 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068065 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068074 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068082 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068091 4794 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068099 4794 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068107 4794 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068116 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.068124 4794 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.102076 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 15 13:54:18 crc kubenswrapper[4794]: W1215 13:54:18.116736 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e501a5e96ad7b9cfa1632b2e7c5bcb61191be7a42b48837329eebe897286a3ac WatchSource:0}: Error finding container e501a5e96ad7b9cfa1632b2e7c5bcb61191be7a42b48837329eebe897286a3ac: Status 404 returned error can't find the container with id e501a5e96ad7b9cfa1632b2e7c5bcb61191be7a42b48837329eebe897286a3ac Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.477159 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.488555 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.506363 4794 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.516757 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.543669 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.576740 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.576801 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.576827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.576846 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:18 
crc kubenswrapper[4794]: I1215 13:54:18.576861 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.576881 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.576966 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.576979 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.576997 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577037 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:19.577024987 +0000 UTC m=+21.429047425 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577082 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:54:19.577077228 +0000 UTC m=+21.429099666 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577118 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577126 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577134 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577151 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:19.57714609 +0000 UTC m=+21.429168528 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577186 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577204 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:19.577198742 +0000 UTC m=+21.429221180 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577226 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: E1215 13:54:18.577243 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:19.577238223 +0000 UTC m=+21.429260661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.582798 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.598270 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.598762 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.601905 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.608797 4794 csr.go:261] certificate signing request csr-dzwqs is approved, waiting to be issued Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.626348 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.643936 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.706823 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.714461 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 15 13:54:18 crc kubenswrapper[4794]: W1215 13:54:18.716906 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e4c98c30418a3bb7d2726a77662867e3f92d8e5c94593eaddce62890b25d86a3 WatchSource:0}: Error finding container e4c98c30418a3bb7d2726a77662867e3f92d8e5c94593eaddce62890b25d86a3: Status 404 returned error can't find the container with id e4c98c30418a3bb7d2726a77662867e3f92d8e5c94593eaddce62890b25d86a3 Dec 15 13:54:18 crc kubenswrapper[4794]: W1215 13:54:18.723450 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d844357f145e6028b862616fcb9892589f6690e526df194181e72cb11765bd01 WatchSource:0}: Error finding container d844357f145e6028b862616fcb9892589f6690e526df194181e72cb11765bd01: Status 404 returned error can't find the container with id d844357f145e6028b862616fcb9892589f6690e526df194181e72cb11765bd01 Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.739497 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.740047 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.741216 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.741823 4794 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.742742 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.743247 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.743835 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.744835 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.745394 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.746301 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.746764 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.747859 4794 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.748445 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.748991 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.749903 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.750426 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.751353 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.751713 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.752241 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.753229 4794 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.753656 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.754566 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.754990 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.755942 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.756307 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.756898 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.757904 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.758358 4794 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.759223 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.759646 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.760472 4794 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.760566 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.762125 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.762964 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.763360 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 15 13:54:18 
crc kubenswrapper[4794]: I1215 13:54:18.764759 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.765363 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.766223 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.766846 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.767838 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.768292 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.769487 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.770158 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 15 13:54:18 
crc kubenswrapper[4794]: I1215 13:54:18.771095 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.771514 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.772335 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.772860 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.773889 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.774335 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.775090 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.775527 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 15 13:54:18 
crc kubenswrapper[4794]: I1215 13:54:18.776349 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.776906 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.777327 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.841695 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d844357f145e6028b862616fcb9892589f6690e526df194181e72cb11765bd01"} Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.842466 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e4c98c30418a3bb7d2726a77662867e3f92d8e5c94593eaddce62890b25d86a3"} Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.843624 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e501a5e96ad7b9cfa1632b2e7c5bcb61191be7a42b48837329eebe897286a3ac"} Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.900886 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.904555 4794 csr.go:257] certificate signing request csr-dzwqs is issued Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.917355 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.929066 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.947722 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.958474 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1c
e42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.969085 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.977413 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:18 crc kubenswrapper[4794]: I1215 13:54:18.990827 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.003381 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.013495 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.025482 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.036276 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.047245 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.056633 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.067986 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.076443 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.086208 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.095318 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.119817 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.137740 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1c
e42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.169673 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6xvkj"] Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.170087 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.171405 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.172013 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.174096 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.182810 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.194869 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.204227 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.215411 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.226178 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.236820 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1c
e42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.246540 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.255032 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.263832 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.282217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb4cf2c0-63d3-40ed-a0f9-a0c381371407-hosts-file\") pod \"node-resolver-6xvkj\" (UID: \"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\") " pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.282257 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4ln\" (UniqueName: \"kubernetes.io/projected/fb4cf2c0-63d3-40ed-a0f9-a0c381371407-kube-api-access-sl4ln\") pod \"node-resolver-6xvkj\" (UID: \"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\") " pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.382647 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/fb4cf2c0-63d3-40ed-a0f9-a0c381371407-hosts-file\") pod \"node-resolver-6xvkj\" (UID: \"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\") " pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.382689 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4ln\" (UniqueName: \"kubernetes.io/projected/fb4cf2c0-63d3-40ed-a0f9-a0c381371407-kube-api-access-sl4ln\") pod \"node-resolver-6xvkj\" (UID: \"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\") " pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.382819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb4cf2c0-63d3-40ed-a0f9-a0c381371407-hosts-file\") pod \"node-resolver-6xvkj\" (UID: \"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\") " pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.404672 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4ln\" (UniqueName: \"kubernetes.io/projected/fb4cf2c0-63d3-40ed-a0f9-a0c381371407-kube-api-access-sl4ln\") pod \"node-resolver-6xvkj\" (UID: \"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\") " pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.517336 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6xvkj" Dec 15 13:54:19 crc kubenswrapper[4794]: W1215 13:54:19.532892 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb4cf2c0_63d3_40ed_a0f9_a0c381371407.slice/crio-352ce78f0c40b118ed282709939f031e56c2ec52ed858d7e68be2cebf81af47c WatchSource:0}: Error finding container 352ce78f0c40b118ed282709939f031e56c2ec52ed858d7e68be2cebf81af47c: Status 404 returned error can't find the container with id 352ce78f0c40b118ed282709939f031e56c2ec52ed858d7e68be2cebf81af47c Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.535848 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-cjbhj"] Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.536401 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.537966 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fq2s6"] Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.538380 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-t9nm7"] Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.538466 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.538700 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.541883 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.542516 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.542810 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.542952 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.543011 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.543175 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.542964 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.543308 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.543186 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.543633 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.544785 4794 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.552409 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.578122 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.583353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.583410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.583434 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.583453 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.583473 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583557 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583674 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583691 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583702 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583679 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:21.583658716 +0000 UTC m=+23.435681154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583739 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583759 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:54:21.583744069 +0000 UTC m=+23.435766507 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583772 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-15 13:54:21.583767099 +0000 UTC m=+23.435789537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583772 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583780 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583791 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583813 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:21.58380628 +0000 UTC m=+23.435828718 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.583859 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:21.583837141 +0000 UTC m=+23.435859589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.595137 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.612014 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1c
e42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.622163 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.630626 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.638608 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.646993 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.655806 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.664626 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.672381 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.680466 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684197 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-cnibin\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684307 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3538082f-5d54-4676-a488-7a3df6b9a1f4-rootfs\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684389 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-cni-binary-copy\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684470 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-socket-dir-parent\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-cnibin\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684680 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a33db367-8090-4973-a405-f5b4c8b8479a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684761 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-etc-kubernetes\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684785 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684821 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-daemon-config\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684844 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-k8s-cni-cncf-io\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684865 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-cni-bin\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684885 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-kubelet\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684935 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxcn\" (UniqueName: \"kubernetes.io/projected/3538082f-5d54-4676-a488-7a3df6b9a1f4-kube-api-access-sxxcn\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684962 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-conf-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684981 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-system-cni-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.684997 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3538082f-5d54-4676-a488-7a3df6b9a1f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685011 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-os-release\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685031 
4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-cni-multus\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685053 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-hostroot\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685104 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vtn2\" (UniqueName: \"kubernetes.io/projected/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-kube-api-access-9vtn2\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685127 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-os-release\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685147 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a33db367-8090-4973-a405-f5b4c8b8479a-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: 
I1215 13:54:19.685185 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-cni-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-netns\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685222 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-multus-certs\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3538082f-5d54-4676-a488-7a3df6b9a1f4-proxy-tls\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.685308 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-system-cni-dir\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc 
kubenswrapper[4794]: I1215 13:54:19.685329 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865qg\" (UniqueName: \"kubernetes.io/projected/a33db367-8090-4973-a405-f5b4c8b8479a-kube-api-access-865qg\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.691153 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.699126 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.708315 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.714768 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.725937 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.734806 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.736051 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.736115 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.736165 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.736214 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.736299 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.736406 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.742296 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.749770 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.757366 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.769962 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.780719 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786157 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vtn2\" (UniqueName: \"kubernetes.io/projected/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-kube-api-access-9vtn2\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-os-release\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a33db367-8090-4973-a405-f5b4c8b8479a-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786281 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-netns\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786314 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-multus-certs\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786344 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-cni-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786388 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3538082f-5d54-4676-a488-7a3df6b9a1f4-proxy-tls\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786421 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-system-cni-dir\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786452 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-865qg\" (UniqueName: \"kubernetes.io/projected/a33db367-8090-4973-a405-f5b4c8b8479a-kube-api-access-865qg\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786497 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-cnibin\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786529 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3538082f-5d54-4676-a488-7a3df6b9a1f4-rootfs\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-cni-binary-copy\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786620 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-socket-dir-parent\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786668 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-cnibin\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786699 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-etc-kubernetes\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-os-release\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a33db367-8090-4973-a405-f5b4c8b8479a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786760 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-netns\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786526 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-multus-certs\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786786 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-cnibin\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-daemon-config\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786845 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3538082f-5d54-4676-a488-7a3df6b9a1f4-rootfs\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786917 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-etc-kubernetes\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786880 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-k8s-cni-cncf-io\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") 
" pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786987 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-socket-dir-parent\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.786986 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-cni-bin\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787027 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-cni-bin\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787036 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-kubelet\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787063 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxcn\" (UniqueName: \"kubernetes.io/projected/3538082f-5d54-4676-a488-7a3df6b9a1f4-kube-api-access-sxxcn\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: 
I1215 13:54:19.787076 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-system-cni-dir\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787086 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-conf-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787295 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-system-cni-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787328 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3538082f-5d54-4676-a488-7a3df6b9a1f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787374 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-os-release\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-kubelet\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787439 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-cni-multus\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787720 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-cni-binary-copy\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787818 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-run-k8s-cni-cncf-io\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787903 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-cnibin\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787910 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-system-cni-dir\") pod 
\"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787404 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-host-var-lib-cni-multus\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.787974 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-hostroot\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.788035 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-hostroot\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.788127 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-daemon-config\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.788186 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-cni-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.788368 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a33db367-8090-4973-a405-f5b4c8b8479a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.788472 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-os-release\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.788512 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-multus-conf-dir\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.789180 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3538082f-5d54-4676-a488-7a3df6b9a1f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.789216 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a33db367-8090-4973-a405-f5b4c8b8479a-cni-binary-copy\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.793302 4794 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3538082f-5d54-4676-a488-7a3df6b9a1f4-proxy-tls\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.811645 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vtn2\" (UniqueName: \"kubernetes.io/projected/0bc89ecc-eb8e-4926-bbb7-14c90f449e00-kube-api-access-9vtn2\") pod \"multus-t9nm7\" (UID: \"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\") " pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.811882 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-865qg\" (UniqueName: \"kubernetes.io/projected/a33db367-8090-4973-a405-f5b4c8b8479a-kube-api-access-865qg\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.814335 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxcn\" (UniqueName: \"kubernetes.io/projected/3538082f-5d54-4676-a488-7a3df6b9a1f4-kube-api-access-sxxcn\") pod \"machine-config-daemon-fq2s6\" (UID: \"3538082f-5d54-4676-a488-7a3df6b9a1f4\") " pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.847918 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6xvkj" event={"ID":"fb4cf2c0-63d3-40ed-a0f9-a0c381371407","Type":"ContainerStarted","Data":"352ce78f0c40b118ed282709939f031e56c2ec52ed858d7e68be2cebf81af47c"} Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.849805 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198"} Dec 15 13:54:19 crc kubenswrapper[4794]: E1215 13:54:19.857155 4794 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.865659 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.871681 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t9nm7" Dec 15 13:54:19 crc kubenswrapper[4794]: W1215 13:54:19.885774 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3538082f_5d54_4676_a488_7a3df6b9a1f4.slice/crio-4131177eed61849f2e2ea0b49ce25e348075f1b16cfec3d8010a08da39fe96de WatchSource:0}: Error finding container 4131177eed61849f2e2ea0b49ce25e348075f1b16cfec3d8010a08da39fe96de: Status 404 returned error can't find the container with id 4131177eed61849f2e2ea0b49ce25e348075f1b16cfec3d8010a08da39fe96de Dec 15 13:54:19 crc kubenswrapper[4794]: W1215 13:54:19.889184 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc89ecc_eb8e_4926_bbb7_14c90f449e00.slice/crio-dcc6341254be5c314e81222d6f2a85b066944b56173e7054d093517bc7fc22a1 WatchSource:0}: Error finding container dcc6341254be5c314e81222d6f2a85b066944b56173e7054d093517bc7fc22a1: Status 404 returned error can't find the container with id dcc6341254be5c314e81222d6f2a85b066944b56173e7054d093517bc7fc22a1 Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.905290 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: 
Certificate expiration is 2026-12-15 13:49:18 +0000 UTC, rotation deadline is 2026-10-23 05:05:20.932152157 +0000 UTC Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.905571 4794 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7479h11m1.026589664s for next certificate rotation Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.908384 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cwnfl"] Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.910189 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.914186 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.914575 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.914758 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.914771 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.916632 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.917080 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 15 13:54:19 crc kubenswrapper[4794]: I1215 13:54:19.919059 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 
13:54:19.990425 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-log-socket\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990566 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-env-overrides\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990667 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-config\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990719 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-netns\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-var-lib-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc 
kubenswrapper[4794]: I1215 13:54:19.990785 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-etc-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990809 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990834 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-script-lib\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990860 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.990946 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-systemd-units\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991194 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-slash\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxv8r\" (UniqueName: \"kubernetes.io/projected/628fdda9-19ac-4a1d-a93b-82a10124a8ad-kube-api-access-pxv8r\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991402 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-systemd\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-node-log\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991475 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-bin\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-kubelet\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-ovn\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-netd\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991663 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovn-node-metrics-cert\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:19.991751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.093489 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-env-overrides\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.093687 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-config\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.093905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-etc-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.093981 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-netns\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094080 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-var-lib-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094137 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-netns\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094176 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-var-lib-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094282 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-etc-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094305 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094430 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-openvswitch\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc 
kubenswrapper[4794]: I1215 13:54:20.094619 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094722 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-script-lib\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-systemd-units\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.094958 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-systemd-units\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095003 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-slash\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095071 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-slash\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095163 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxv8r\" (UniqueName: \"kubernetes.io/projected/628fdda9-19ac-4a1d-a93b-82a10124a8ad-kube-api-access-pxv8r\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-bin\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095341 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-bin\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095353 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-systemd\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095414 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-systemd\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095472 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-node-log\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095576 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-node-log\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-kubelet\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095668 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-kubelet\") pod \"ovnkube-node-cwnfl\" (UID: 
\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-ovn\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095798 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-netd\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095842 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovn-node-metrics-cert\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095874 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-ovn\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.095946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.096036 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-log-socket\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.096101 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-log-socket\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.096686 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-netd\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.101340 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovn-node-metrics-cert\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 
crc kubenswrapper[4794]: I1215 13:54:20.110012 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a33db367-8090-4973-a405-f5b4c8b8479a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cjbhj\" (UID: \"a33db367-8090-4973-a405-f5b4c8b8479a\") " pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.110114 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-env-overrides\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.113039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-config\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.113094 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-script-lib\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.113296 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxv8r\" (UniqueName: \"kubernetes.io/projected/628fdda9-19ac-4a1d-a93b-82a10124a8ad-kube-api-access-pxv8r\") pod \"ovnkube-node-cwnfl\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.125143 4794 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.150094 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.158752 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.161728 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.172149 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.183085 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.196342 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.207247 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-r
esources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.216740 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.228875 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.250168 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.253512 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.265110 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.273640 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.281057 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: W1215 13:54:20.536135 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda33db367_8090_4973_a405_f5b4c8b8479a.slice/crio-92bc64b45c6a5ca3fba08932823b30bda615aad06e50ba6b0f25b5cf6d4cfd61 WatchSource:0}: Error finding container 92bc64b45c6a5ca3fba08932823b30bda615aad06e50ba6b0f25b5cf6d4cfd61: Status 404 returned error can't find the container with id 
92bc64b45c6a5ca3fba08932823b30bda615aad06e50ba6b0f25b5cf6d4cfd61 Dec 15 13:54:20 crc kubenswrapper[4794]: W1215 13:54:20.536935 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628fdda9_19ac_4a1d_a93b_82a10124a8ad.slice/crio-2af415f6f69391cfb1f68c3874b3b2fce7fa22a82cbab5a1d7b70729c0713f85 WatchSource:0}: Error finding container 2af415f6f69391cfb1f68c3874b3b2fce7fa22a82cbab5a1d7b70729c0713f85: Status 404 returned error can't find the container with id 2af415f6f69391cfb1f68c3874b3b2fce7fa22a82cbab5a1d7b70729c0713f85 Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.853612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"2af415f6f69391cfb1f68c3874b3b2fce7fa22a82cbab5a1d7b70729c0713f85"} Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.855458 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"92bc64b45c6a5ca3fba08932823b30bda615aad06e50ba6b0f25b5cf6d4cfd61"} Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.856809 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerStarted","Data":"dcc6341254be5c314e81222d6f2a85b066944b56173e7054d093517bc7fc22a1"} Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.857973 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"4131177eed61849f2e2ea0b49ce25e348075f1b16cfec3d8010a08da39fe96de"} Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.869213 4794 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.884750 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.900121 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.912253 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.924217 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.936690 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.951983 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.966057 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.981500 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:20 crc kubenswrapper[4794]: I1215 13:54:20.996672 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.004106 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bpwnn"] Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.004533 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.007882 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.008190 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.008438 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.011506 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.018156 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.033462 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.066717 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.077564 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.088183 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.098049 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.104628 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.107071 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abd16c69-cba3-49a5-adc8-92a14453db80-host\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.107110 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/abd16c69-cba3-49a5-adc8-92a14453db80-serviceca\") pod \"node-ca-bpwnn\" (UID: 
\"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.107158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqk66\" (UniqueName: \"kubernetes.io/projected/abd16c69-cba3-49a5-adc8-92a14453db80-kube-api-access-jqk66\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.114208 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.129183 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.141441 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.155726 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.171529 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.182869 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.194212 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.203187 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.207818 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abd16c69-cba3-49a5-adc8-92a14453db80-host\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.207895 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/abd16c69-cba3-49a5-adc8-92a14453db80-serviceca\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.207951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqk66\" (UniqueName: 
\"kubernetes.io/projected/abd16c69-cba3-49a5-adc8-92a14453db80-kube-api-access-jqk66\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.207985 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abd16c69-cba3-49a5-adc8-92a14453db80-host\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.209316 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/abd16c69-cba3-49a5-adc8-92a14453db80-serviceca\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.218509 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\
\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"
name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.225687 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.229574 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqk66\" (UniqueName: \"kubernetes.io/projected/abd16c69-cba3-49a5-adc8-92a14453db80-kube-api-access-jqk66\") pod \"node-ca-bpwnn\" (UID: \"abd16c69-cba3-49a5-adc8-92a14453db80\") " pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.320219 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bpwnn" Dec 15 13:54:21 crc kubenswrapper[4794]: W1215 13:54:21.341353 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabd16c69_cba3_49a5_adc8_92a14453db80.slice/crio-c409c7d9358214cdab1105bfe3a774906defb6a923be45d4fa8d4af6cfc04ccc WatchSource:0}: Error finding container c409c7d9358214cdab1105bfe3a774906defb6a923be45d4fa8d4af6cfc04ccc: Status 404 returned error can't find the container with id c409c7d9358214cdab1105bfe3a774906defb6a923be45d4fa8d4af6cfc04ccc Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.550718 4794 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.551736 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.551761 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.551769 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.551844 4794 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.559721 4794 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.559961 4794 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.560754 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.560780 4794 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.560796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.560813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.560824 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.582044 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.585464 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.585538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.585564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.585623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.585661 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.599833 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.603788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.603809 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.603818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.603831 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.603840 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.612482 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.612541 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.612565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.612663 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.612666 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:54:25.612633572 +0000 UTC m=+27.464656030 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.612753 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.612761 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:25.612735995 +0000 UTC m=+27.464758453 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.612868 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.612913 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.612999 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613038 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613058 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613072 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613097 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613004 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:25.612985852 +0000 UTC m=+27.465008340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613116 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613129 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-15 13:54:25.613115226 +0000 UTC m=+27.465137674 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.613183 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:25.613156287 +0000 UTC m=+27.465178765 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.614642 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.618551 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.618617 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.618638 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.618660 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.618676 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.630146 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.637015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.637065 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.637093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.637117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.637136 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.654294 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.654448 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.656805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.656847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.656863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.656883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.656897 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.737104 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.737171 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.737128 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.737300 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.737416 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:21 crc kubenswrapper[4794]: E1215 13:54:21.737572 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.759659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.759705 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.759720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.759740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.759758 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.862123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.862203 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.862221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.862248 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.862264 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.863151 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bpwnn" event={"ID":"abd16c69-cba3-49a5-adc8-92a14453db80","Type":"ContainerStarted","Data":"c409c7d9358214cdab1105bfe3a774906defb6a923be45d4fa8d4af6cfc04ccc"} Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.865703 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768"} Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.965846 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.965895 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.965907 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.965924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:21 crc kubenswrapper[4794]: I1215 13:54:21.965937 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:21Z","lastTransitionTime":"2025-12-15T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.069784 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.069865 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.069894 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.069926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.069951 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.173788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.173828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.173837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.173853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.173863 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.275674 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.275729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.275738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.275750 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.275760 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.378233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.378511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.378689 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.378899 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.379098 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.481498 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.482169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.482403 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.482569 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.482817 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.585022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.585276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.585390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.585483 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.585556 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.687647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.687814 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.687904 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.688009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.688088 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.790261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.790303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.790314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.790329 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.790341 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.871531 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bpwnn" event={"ID":"abd16c69-cba3-49a5-adc8-92a14453db80","Type":"ContainerStarted","Data":"a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.874890 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.876866 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2" exitCode=0 Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.876965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.880151 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.882920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6xvkj" event={"ID":"fb4cf2c0-63d3-40ed-a0f9-a0c381371407","Type":"ContainerStarted","Data":"63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.885175 4794 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerStarted","Data":"babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.887294 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.893071 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.893119 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.893133 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.893152 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.893166 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.894158 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.903406 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.915254 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.925839 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.943988 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.962919 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.979853 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1c
e42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.993514 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.995673 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.995728 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.995746 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.995771 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:22 crc kubenswrapper[4794]: I1215 13:54:22.995790 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:22Z","lastTransitionTime":"2025-12-15T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.000937 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.020976 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.035419 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.053237 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.064670 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.076305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.085273 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.093690 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.097768 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.097810 4794 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.097822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.097840 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.097852 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.105873 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.114638 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.126901 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.140798 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.151891 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.182022 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.200209 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.200250 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.200259 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.200271 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.200281 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.208042 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.225527 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.236808 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.250628 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.266795 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.274057 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.302470 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.302508 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.302518 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.302530 4794 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.302539 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.405336 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.405373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.405384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.405398 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.405407 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.507855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.507904 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.507939 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.507957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.507969 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.610734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.610785 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.610793 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.610807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.610816 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.713404 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.713444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.713461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.713479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.713492 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.736618 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:23 crc kubenswrapper[4794]: E1215 13:54:23.736751 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.736647 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.736628 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:23 crc kubenswrapper[4794]: E1215 13:54:23.736837 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:23 crc kubenswrapper[4794]: E1215 13:54:23.736954 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.816023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.816053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.816061 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.816073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.816082 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.891570 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.894220 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.896783 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.896844 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.898484 4794 generic.go:334] "Generic (PLEG): container finished" podID="a33db367-8090-4973-a405-f5b4c8b8479a" containerID="457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b" exitCode=0 Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.898608 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerDied","Data":"457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.913062 
4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.920827 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.920860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.920873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.920891 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.920905 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:23Z","lastTransitionTime":"2025-12-15T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.941974 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.955149 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a
2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.965990 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.979907 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:23 crc kubenswrapper[4794]: I1215 13:54:23.996117 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.012748 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.022790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.022831 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.022847 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.022864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.022876 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.027029 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.038245 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.060230 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.075226 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.098812 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.110244 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.123255 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.126164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc 
kubenswrapper[4794]: I1215 13:54:24.126198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.126225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.126240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.126249 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.132519 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a
2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.144767 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.152751 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.162935 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.173814 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.190305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.213872 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.228710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.228791 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.228816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.228845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.228884 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.230931 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a02
56dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.247610 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.272028 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.281520 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.292933 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.306401 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.323496 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.331980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.332025 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.332042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.332065 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.332082 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.436950 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.437006 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.437020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.437036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.437065 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.540974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.541235 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.541245 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.541258 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.541268 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.643463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.643494 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.643503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.643518 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.643527 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.746186 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.746230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.746245 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.746262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.746272 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.848313 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.848360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.848372 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.848390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.848409 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.904466 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.912817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.912854 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.912863 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.912873 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.918031 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.931949 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.946320 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.961259 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.961303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.961315 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.961330 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.961343 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:24Z","lastTransitionTime":"2025-12-15T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.962419 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a02
56dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.978154 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:24 crc kubenswrapper[4794]: I1215 13:54:24.996066 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:24Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.011323 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.024187 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.040292 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.052003 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.064153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.064194 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.064208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.064225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.064240 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.064762 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.074254 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.084163 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.094248 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.166264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.166292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.166301 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.166313 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.166321 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.268165 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.268192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.268200 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.268212 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.268222 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.370655 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.370696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.370707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.370723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.370734 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.473910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.473955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.473964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.473978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.473987 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.576149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.576177 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.576185 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.576197 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.576206 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.653558 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.653681 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.653719 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653834 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653873 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:54:33.653841869 +0000 UTC m=+35.505864317 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653890 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653884 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653929 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.653927 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653941 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.653993 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.654031 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:33.654018594 +0000 UTC m=+35.506041032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.654122 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653948 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.654166 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:33.654155448 +0000 UTC m=+35.506178006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.653979 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.654241 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:33.65421744 +0000 UTC m=+35.506239908 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.654312 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-15 13:54:33.654282482 +0000 UTC m=+35.506304950 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.678838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.678902 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.678925 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.678954 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.678978 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.736532 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.736698 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.736699 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.736789 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.736810 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:25 crc kubenswrapper[4794]: E1215 13:54:25.737000 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.781202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.781247 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.781257 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.781272 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.781283 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.884035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.884085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.884102 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.884122 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.884138 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:25Z","lastTransitionTime":"2025-12-15T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.918137 4794 generic.go:334] "Generic (PLEG): container finished" podID="a33db367-8090-4973-a405-f5b4c8b8479a" containerID="4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783" exitCode=0 Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.918226 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerDied","Data":"4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783"} Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.943566 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.957174 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:25 crc kubenswrapper[4794]: I1215 13:54:25.973924 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:25.994732 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
2-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:25Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.006807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.006854 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.006864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.006881 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.006891 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.032890 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a02
56dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.061361 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.077574 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.099395 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.108622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.108665 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.108678 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.108695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.108721 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.110810 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.121449 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.143938 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.156521 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.170642 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.186169 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.210636 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.210670 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.210681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.210696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.210710 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.313362 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.313426 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.313440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.313459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.313472 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.415773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.415837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.415857 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.415883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.415903 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.518911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.518969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.518985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.519020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.519035 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.622387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.622441 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.622459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.622483 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.622500 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.724912 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.724972 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.724990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.725014 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.725032 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.828803 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.828871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.828895 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.828926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.828949 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.925707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.931298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.931362 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.931387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.931417 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.931441 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:26Z","lastTransitionTime":"2025-12-15T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.946441 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.962559 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.985715 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:26 crc kubenswrapper[4794]: I1215 13:54:26.998627 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.017278 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.026938 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.033741 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.033773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.033788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.033802 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.033811 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.039687 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.051628 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.068814 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.082706 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.094189 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.104695 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.114558 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.127033 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.135921 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.135947 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.135955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.135970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.135981 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.238561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.238614 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.238624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.238643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.238653 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.342082 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.342132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.342149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.342176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.342197 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.445366 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.445481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.445535 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.445564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.445632 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.548238 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.548309 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.548327 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.548350 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.548368 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.651783 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.651816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.651823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.651836 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.651844 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.737009 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.737090 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.737124 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:27 crc kubenswrapper[4794]: E1215 13:54:27.737312 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:27 crc kubenswrapper[4794]: E1215 13:54:27.737467 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:27 crc kubenswrapper[4794]: E1215 13:54:27.737617 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.755994 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.756033 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.756042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.756055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.756064 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.859064 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.859105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.859118 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.859135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.859149 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.935817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.938985 4794 generic.go:334] "Generic (PLEG): container finished" podID="a33db367-8090-4973-a405-f5b4c8b8479a" containerID="dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452" exitCode=0 Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.939049 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerDied","Data":"dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.956108 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.963412 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.963462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.963479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.963502 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.963519 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:27Z","lastTransitionTime":"2025-12-15T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.974521 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a02
56dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:27 crc kubenswrapper[4794]: I1215 13:54:27.992059 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.009119 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.028229 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.050468 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.066399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.066448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.066460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.066478 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.066490 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.081481 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.113880 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.124842 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.138532 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.154868 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.169110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.169144 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.169156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.169170 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.169179 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.170304 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.184902 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.197130 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.271359 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.271402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.271414 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.271430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.271441 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.374264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.374320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.374338 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.374364 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.374381 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.476955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.477015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.477032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.477057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.477110 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.574720 4794 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.581477 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.581534 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.581556 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.581621 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.581646 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.685552 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.685650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.685669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.685693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.685712 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.756910 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.777246 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.788670 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.788710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.788722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.788742 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.788754 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.797711 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.814508 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.831865 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.856954 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
2-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.877031 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.900093 4794 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.900726 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.900790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.900815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.900850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.900874 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:28Z","lastTransitionTime":"2025-12-15T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.918853 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.937740 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.954283 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.968559 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:28 crc kubenswrapper[4794]: I1215 13:54:28.998051 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.003199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.003239 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.003249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.003266 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.003277 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.010633 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:29Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.106492 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.106563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.106617 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.106652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.106673 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.209550 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.209645 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.209670 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.209701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.209721 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.313353 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.313407 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.313422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.313446 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.313462 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.417212 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.417318 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.417346 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.417383 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.417412 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.519358 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.519424 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.519442 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.519470 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.519488 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.622239 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.622330 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.622366 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.622391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.622412 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.725825 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.725910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.725936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.725985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.726108 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.736857 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.736890 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.736864 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:29 crc kubenswrapper[4794]: E1215 13:54:29.737056 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:29 crc kubenswrapper[4794]: E1215 13:54:29.737210 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:29 crc kubenswrapper[4794]: E1215 13:54:29.737329 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.828984 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.829064 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.829092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.829430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.829450 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.932864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.932912 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.932920 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.932937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:29 crc kubenswrapper[4794]: I1215 13:54:29.932950 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:29Z","lastTransitionTime":"2025-12-15T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.037540 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.037665 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.037694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.037728 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.037764 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.141128 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.141503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.141515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.141536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.141549 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.244918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.244968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.244987 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.245027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.245043 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.347881 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.347946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.347969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.347998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.348020 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.450526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.450612 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.450631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.450654 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.450670 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.553430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.553466 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.553475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.553489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.553499 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.656559 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.656667 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.656694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.656730 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.656769 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.759874 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.759924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.759937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.759955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.759968 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.864176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.864225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.864242 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.864265 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.864281 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.954322 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718"} Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.965998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.966026 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.966036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.966052 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:30 crc kubenswrapper[4794]: I1215 13:54:30.966065 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:30Z","lastTransitionTime":"2025-12-15T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.080838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.081184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.081202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.081226 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.081244 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.184851 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.184913 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.184931 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.184956 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.184973 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.287490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.287551 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.287568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.287625 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.287643 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.390103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.390181 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.390208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.390243 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.390271 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.493362 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.493439 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.493457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.493489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.493507 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.652388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.652448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.652465 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.652490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.652510 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.736814 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.736887 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.736910 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:31 crc kubenswrapper[4794]: E1215 13:54:31.736999 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:31 crc kubenswrapper[4794]: E1215 13:54:31.737112 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:31 crc kubenswrapper[4794]: E1215 13:54:31.737297 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.754999 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.755056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.755076 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.755100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.755117 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.858406 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.858451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.858463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.858481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.858497 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.924081 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.924111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.924126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.924141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.924153 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: E1215 13:54:31.946066 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:31Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.951850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.951907 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.951930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.951962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.951980 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:31 crc kubenswrapper[4794]: E1215 13:54:31.977928 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:31Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.982889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.983037 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.983056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.983082 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:31 crc kubenswrapper[4794]: I1215 13:54:31.983101 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:31Z","lastTransitionTime":"2025-12-15T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: E1215 13:54:32.003376 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.009208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.009440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.009646 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.009818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.009963 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: E1215 13:54:32.033026 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.038116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.038291 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.038546 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.038750 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.038889 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: E1215 13:54:32.052402 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: E1215 13:54:32.053168 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.055137 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.055296 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.055400 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.055561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.055706 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.159794 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.159866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.159886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.159910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.159927 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.263302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.263373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.263392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.263417 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.263434 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.366393 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.366457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.366475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.366503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.366521 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.375016 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn"] Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.375670 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.378024 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.378289 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.394851 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.415057 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.430946 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.442435 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08abf0e4-50ec-4ee1-a927-e6383b60ab38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.442513 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08abf0e4-50ec-4ee1-a927-e6383b60ab38-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.442551 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt52p\" (UniqueName: \"kubernetes.io/projected/08abf0e4-50ec-4ee1-a927-e6383b60ab38-kube-api-access-xt52p\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.442795 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08abf0e4-50ec-4ee1-a927-e6383b60ab38-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.446279 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.465988 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.469338 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.469390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.469408 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 
13:54:32.469432 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.469449 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.490455 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e4570
71c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.514023 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ff
dbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.527949 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.542438 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.543844 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08abf0e4-50ec-4ee1-a927-e6383b60ab38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.543891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08abf0e4-50ec-4ee1-a927-e6383b60ab38-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 
13:54:32.543932 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt52p\" (UniqueName: \"kubernetes.io/projected/08abf0e4-50ec-4ee1-a927-e6383b60ab38-kube-api-access-xt52p\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.543983 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08abf0e4-50ec-4ee1-a927-e6383b60ab38-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.544704 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08abf0e4-50ec-4ee1-a927-e6383b60ab38-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.544966 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08abf0e4-50ec-4ee1-a927-e6383b60ab38-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.551315 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08abf0e4-50ec-4ee1-a927-e6383b60ab38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: 
\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.567854 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.570977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt52p\" (UniqueName: \"kubernetes.io/projected/08abf0e4-50ec-4ee1-a927-e6383b60ab38-kube-api-access-xt52p\") pod \"ovnkube-control-plane-749d76644c-4d4wn\" (UID: \"08abf0e4-50ec-4ee1-a927-e6383b60ab38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.571471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.571506 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.571522 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.571544 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.571561 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.579954 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.610835 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.662205 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.674524 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.674602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.674625 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.674650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.674665 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.677493 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.689305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.696795 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" Dec 15 13:54:32 crc kubenswrapper[4794]: W1215 13:54:32.707980 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08abf0e4_50ec_4ee1_a927_e6383b60ab38.slice/crio-7a401d205efb1606019eb1c712f5bc84fd16b776c0b03ce9160bd56b1bd51900 WatchSource:0}: Error finding container 7a401d205efb1606019eb1c712f5bc84fd16b776c0b03ce9160bd56b1bd51900: Status 404 returned error can't find the container with id 7a401d205efb1606019eb1c712f5bc84fd16b776c0b03ce9160bd56b1bd51900 Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.776957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.777001 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.777012 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.777027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.777039 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.878773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.878810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.878821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.878837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.878848 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.965498 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" event={"ID":"08abf0e4-50ec-4ee1-a927-e6383b60ab38","Type":"ContainerStarted","Data":"7a401d205efb1606019eb1c712f5bc84fd16b776c0b03ce9160bd56b1bd51900"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.971908 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.975953 4794 generic.go:334] "Generic (PLEG): container finished" podID="a33db367-8090-4973-a405-f5b4c8b8479a" containerID="d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718" exitCode=0 Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.975991 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerDied","Data":"d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.981254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.981318 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.981340 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.981369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:32 crc kubenswrapper[4794]: 
I1215 13:54:32.981392 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:32Z","lastTransitionTime":"2025-12-15T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:32 crc kubenswrapper[4794]: I1215 13:54:32.993304 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:32Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.012888 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af
0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.029158 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.042469 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.054069 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.067516 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.082553 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
2-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.083630 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.083663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.083677 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.083696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.083708 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.094771 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a02
56dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.106657 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.118698 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.135864 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.151023 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.169153 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.185628 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.185670 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.185684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.185704 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.185721 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.197494 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.207114 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.288349 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.288388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.288397 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 
13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.288411 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.288421 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.391740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.391817 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.391835 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.391860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.391877 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.494546 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.494618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.494635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.494659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.494675 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.597870 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.597927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.597946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.597976 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.597994 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.654428 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.654659 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.654728 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:54:49.654688888 +0000 UTC m=+51.506711366 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.654810 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.654865 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.654903 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.654931 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.654900 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655019 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655139 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655208 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655020 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:49.654988366 +0000 UTC m=+51.507010894 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655242 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.655299 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655388 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655396 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:49.655367577 +0000 UTC m=+51.507390055 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655504 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:49.65546825 +0000 UTC m=+51.507490728 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.655542 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:49.655527232 +0000 UTC m=+51.507549710 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.704878 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.704956 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.704975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.705002 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.705019 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.736213 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.736250 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.736337 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.736405 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.736541 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.736681 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.807962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.808016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.808039 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.808068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.808088 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.885458 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4xt6f"] Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.886666 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:33 crc kubenswrapper[4794]: E1215 13:54:33.886909 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.906297 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.910282 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.910344 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.910360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.910385 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.910404 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:33Z","lastTransitionTime":"2025-12-15T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.927416 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z 
is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.945252 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.958409 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42rw\" (UniqueName: \"kubernetes.io/projected/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-kube-api-access-t42rw\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.958647 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.972389 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.982575 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
event={"ID":"08abf0e4-50ec-4ee1-a927-e6383b60ab38","Type":"ContainerStarted","Data":"0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.989193 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7"} Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.989280 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.989416 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:33 crc kubenswrapper[4794]: I1215 13:54:33.993512 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.010132 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.013285 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.013331 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.013349 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.013370 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.013385 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.027944 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc 
kubenswrapper[4794]: I1215 13:54:34.048421 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.059793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t42rw\" (UniqueName: \"kubernetes.io/projected/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-kube-api-access-t42rw\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.059852 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:34 crc kubenswrapper[4794]: E1215 13:54:34.060105 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:34 crc kubenswrapper[4794]: E1215 13:54:34.060188 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:34.560164804 +0000 UTC m=+36.412187282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.069002 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"im
age\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.091727 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.094567 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42rw\" (UniqueName: \"kubernetes.io/projected/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-kube-api-access-t42rw\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.116499 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.116555 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.116573 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.116626 4794 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.116645 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.117745 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.137501 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.160158 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.179396 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.209698 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.220228 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.220290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.220308 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.220334 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.220351 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.227464 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.249064 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.261945 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.262882 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.281729 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.313278 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.322807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.322870 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.322889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.322914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.322932 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.330158 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.348233 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.367136 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.383605 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.396813 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.409259 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.422234 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.424952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.424989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.425004 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.425025 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.425040 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.441857 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc 
kubenswrapper[4794]: I1215 13:54:34.465473 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.492281 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34d
b009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.510990 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.528279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.528308 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.528319 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.528335 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.528347 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.529366 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.548782 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.564997 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:34 crc kubenswrapper[4794]: E1215 13:54:34.565203 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:34 crc kubenswrapper[4794]: E1215 13:54:34.565300 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:35.565267805 +0000 UTC m=+37.417290293 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.566479 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.584678 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.601083 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.620205 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.631524 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.631612 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.631633 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.631657 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.631677 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.636518 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.654516 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a
2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.670276 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc 
kubenswrapper[4794]: I1215 13:54:34.691246 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.724966 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.735144 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.735199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.735217 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.735240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.735258 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.754262 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.774389 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.794298 4794 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.807196 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.827931 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.838645 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.838716 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.838741 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.838772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.838799 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.848894 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda6
9f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.881277 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:34Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.941888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.941955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.941977 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.942008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.942033 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:34Z","lastTransitionTime":"2025-12-15T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:34 crc kubenswrapper[4794]: I1215 13:54:34.991688 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.045648 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.045713 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.045731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.045755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.045773 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.148396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.148822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.148997 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.149180 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.149377 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.253048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.253113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.253178 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.253208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.253228 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.355879 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.355924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.355936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.355952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.355965 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.459027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.459098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.459120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.459148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.459169 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.562715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.563086 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.563291 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.563490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.563727 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.575624 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:35 crc kubenswrapper[4794]: E1215 13:54:35.575835 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:35 crc kubenswrapper[4794]: E1215 13:54:35.575951 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:37.57593399 +0000 UTC m=+39.427956438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.666229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.666273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.666284 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.666300 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.666311 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.736741 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:35 crc kubenswrapper[4794]: E1215 13:54:35.737028 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.736838 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:35 crc kubenswrapper[4794]: E1215 13:54:35.737226 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.736787 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:35 crc kubenswrapper[4794]: E1215 13:54:35.737388 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.736875 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:35 crc kubenswrapper[4794]: E1215 13:54:35.737553 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.768556 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.768627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.768649 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.768670 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.768684 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.872264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.872608 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.872682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.872774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.872848 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.975320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.975360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.975374 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.975394 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.975405 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:35Z","lastTransitionTime":"2025-12-15T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.999461 4794 generic.go:334] "Generic (PLEG): container finished" podID="a33db367-8090-4973-a405-f5b4c8b8479a" containerID="6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7" exitCode=0 Dec 15 13:54:35 crc kubenswrapper[4794]: I1215 13:54:35.999534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerDied","Data":"6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.002766 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.002829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" event={"ID":"08abf0e4-50ec-4ee1-a927-e6383b60ab38","Type":"ContainerStarted","Data":"a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.023644 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.038291 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.051504 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.062493 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.077736 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.077770 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.077780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 
13:54:36.077795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.077805 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.081945 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cce
eeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.097971 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.110748 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.131357 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.141911 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.152288 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.162886 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.172430 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.179641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.179870 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.179940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.180003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.180123 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.183371 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.193035 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.202662 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.214381 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc 
kubenswrapper[4794]: I1215 13:54:36.226191 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.234953 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.244986 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.253619 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc 
kubenswrapper[4794]: I1215 13:54:36.265935 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.276694 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.282198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.282234 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.282243 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.282258 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.282268 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.291820 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.308437 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.324036 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.338360 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.349128 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.365244 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.376328 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.385000 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.385027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.385036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.385048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 
13:54:36.385057 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.390072 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.402415 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6
ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.420353 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93a
bb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:36Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.488552 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.488914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.489087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.489272 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.489429 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.592631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.592703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.592730 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.592763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.592785 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.695809 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.695854 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.695871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.695893 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.695912 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.798740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.798793 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.798811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.798833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.798852 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.902201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.902270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.902294 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.902325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:36 crc kubenswrapper[4794]: I1215 13:54:36.902347 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:36Z","lastTransitionTime":"2025-12-15T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.004639 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.004694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.004716 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.004743 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.004765 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.107821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.107886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.107909 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.107941 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.107963 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.210763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.210812 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.210828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.210849 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.210943 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.313619 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.314440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.314471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.314496 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.314514 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.417346 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.417432 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.417449 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.417471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.417487 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.520495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.520627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.520656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.520684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.520704 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.596775 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:37 crc kubenswrapper[4794]: E1215 13:54:37.597053 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:37 crc kubenswrapper[4794]: E1215 13:54:37.597197 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:41.597166675 +0000 UTC m=+43.449189153 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.623996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.624052 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.624070 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.624094 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.624112 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.727905 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.727961 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.727980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.728011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.728035 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.736340 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.736351 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.736455 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.736498 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:37 crc kubenswrapper[4794]: E1215 13:54:37.737226 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:37 crc kubenswrapper[4794]: E1215 13:54:37.737386 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:37 crc kubenswrapper[4794]: E1215 13:54:37.737512 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:37 crc kubenswrapper[4794]: E1215 13:54:37.737717 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.831221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.831295 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.831318 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.831349 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.831372 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.934554 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.934668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.934691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.934729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:37 crc kubenswrapper[4794]: I1215 13:54:37.934770 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:37Z","lastTransitionTime":"2025-12-15T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.037330 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.037385 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.037405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.037428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.037446 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.141622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.141749 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.141810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.141837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.141935 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.245189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.245254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.245278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.245304 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.245321 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.348814 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.348889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.348916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.348945 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.348963 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.452222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.452283 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.452301 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.452324 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.452343 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.554964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.555007 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.555017 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.555033 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.555045 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.657612 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.657651 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.657662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.657678 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.657688 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.760964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.761003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.761015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.761057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.761070 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.762245 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.775892 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.793401 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.807428 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.824823 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.842845 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.862035 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.863195 4794 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.863249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.863263 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.863281 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.863292 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.875856 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.886092 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.898270 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc 
kubenswrapper[4794]: I1215 13:54:38.912361 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.924689 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.935725 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.949255 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.965290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.965347 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.965363 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.965380 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.965395 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:38Z","lastTransitionTime":"2025-12-15T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.967725 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:38 crc kubenswrapper[4794]: I1215 13:54:38.983328 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.014388 4794 generic.go:334] "Generic (PLEG): container finished" podID="a33db367-8090-4973-a405-f5b4c8b8479a" containerID="441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c" exitCode=0 Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.014434 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerDied","Data":"441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.031394 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.056755 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.070805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.070844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.070856 4794 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.070873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.070884 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.070941 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.085769 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.100903 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e4570
71c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.114753 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.126292 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.147444 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.157979 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.172218 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.174624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.174646 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.174655 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.174667 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.174675 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.197085 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z 
is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.207424 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.222100 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.231898 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.240512 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.247962 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:39 crc 
kubenswrapper[4794]: I1215 13:54:39.276771 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.276806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.276842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.276859 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.276871 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.379803 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.379854 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.379871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.379895 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.379913 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.482389 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.482433 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.482444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.482462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.482474 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.585457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.585517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.585534 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.585557 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.585574 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.689342 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.689868 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.689886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.689909 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.689927 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.736301 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.736348 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:39 crc kubenswrapper[4794]: E1215 13:54:39.736533 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.736681 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.736783 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:39 crc kubenswrapper[4794]: E1215 13:54:39.736965 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:39 crc kubenswrapper[4794]: E1215 13:54:39.737142 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:39 crc kubenswrapper[4794]: E1215 13:54:39.737276 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.794532 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.794624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.794643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.794667 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.794685 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.897693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.897734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.897759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.897774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:39 crc kubenswrapper[4794]: I1215 13:54:39.897785 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:39Z","lastTransitionTime":"2025-12-15T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.001230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.001290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.001301 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.001323 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.001340 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.012887 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.013148 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.013825 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.014434 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.015144 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.015213 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.015668 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.016049 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.016432 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 15 13:54:40 crc kubenswrapper[4794]: E1215 13:54:40.016524 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.023404 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" event={"ID":"a33db367-8090-4973-a405-f5b4c8b8479a","Type":"ContainerStarted","Data":"d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.026051 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/0.log" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.030415 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" exitCode=1 Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.030491 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.031461 4794 scope.go:117] "RemoveContainer" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.048348 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.073378 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.094972 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.109924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.109990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.110005 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.110030 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.110046 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.115452 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.132723 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.151470 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.169354 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.193993 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.205684 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.212964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.212991 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.213001 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.213015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 
13:54:40.213025 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.221602 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.235423 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6
ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.245266 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93a
bb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.258289 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.269223 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.284859 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.295014 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc 
kubenswrapper[4794]: I1215 13:54:40.310803 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"message\\\":\\\"t node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to 
call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z]\\\\nI1215 13:54:39.190896 6093 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-daemon per-node LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190909 6093 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-daemon template LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190840 6093 services_controller.go:434] Service openshift-kube-apiserver/apiserver retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{apiserver openshift-kube-apiserver 1c35c6c5-2bb9-4633-8ecb-881a1ff8d2fe 7887 0 2025-02-23 05:33:28 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:2787a90499aeabb4cf7acbefa3d43f6c763431fdc60904fdfa1fe74cd04203ee] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: 
true,}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb
8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.315444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.315468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.315477 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.315490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.315499 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.322438 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.335814 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.354510 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.367084 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.384363 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.401854 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.414076 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc 
kubenswrapper[4794]: I1215 13:54:40.418447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.418488 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.418499 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.418514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.418524 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.430028 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.446072 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.458699 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.470954 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.481934 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.494244 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.505847 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.518108 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:40Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.520298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.520332 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.520344 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.520362 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.520375 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.623212 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.623267 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.623283 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.623300 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.623312 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.725056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.725095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.725104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.725117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.725125 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.827603 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.827641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.827652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.827671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.827683 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.929679 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.929729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.929743 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.929763 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:40 crc kubenswrapper[4794]: I1215 13:54:40.929777 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:40Z","lastTransitionTime":"2025-12-15T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.042753 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.042823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.042844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.042888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.042959 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.049799 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/0.log" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.053269 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.074518 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.091284 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.106127 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.122280 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc 
kubenswrapper[4794]: I1215 13:54:41.143797 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.146201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.146253 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.146270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.146292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.146309 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.165045 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a02
56dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.183481 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.203819 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.224737 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.246244 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.249184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.249321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.249397 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.249429 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.249499 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.260082 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.281522 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"message\\\":\\\"t node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z]\\\\nI1215 13:54:39.190896 6093 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-daemon per-node LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190909 6093 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-daemon template LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190840 6093 services_controller.go:434] Service openshift-kube-apiserver/apiserver retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{apiserver openshift-kube-apiserver 1c35c6c5-2bb9-4633-8ecb-881a1ff8d2fe 7887 0 2025-02-23 05:33:28 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:2787a90499aeabb4cf7acbefa3d43f6c763431fdc60904fdfa1fe74cd04203ee] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: 
true,}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.291603 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.306834 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T1
3:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.322022 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.333327 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca8
9c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:41Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.352622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.352685 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.352705 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.352730 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.352749 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.455759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.455834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.455856 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.455886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.455908 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.559054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.559126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.559149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.559179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.559201 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.648328 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:41 crc kubenswrapper[4794]: E1215 13:54:41.648505 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:41 crc kubenswrapper[4794]: E1215 13:54:41.648635 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:54:49.6485766 +0000 UTC m=+51.500599078 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.661962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.662020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.662037 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.662060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.662078 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.736136 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.736187 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.736162 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.736158 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:41 crc kubenswrapper[4794]: E1215 13:54:41.736298 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:41 crc kubenswrapper[4794]: E1215 13:54:41.736441 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:41 crc kubenswrapper[4794]: E1215 13:54:41.736538 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:41 crc kubenswrapper[4794]: E1215 13:54:41.736664 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.765365 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.765430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.765451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.765477 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.765496 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.868711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.868796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.868830 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.868862 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.868883 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.971384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.971465 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.971487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.971518 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:41 crc kubenswrapper[4794]: I1215 13:54:41.971543 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:41Z","lastTransitionTime":"2025-12-15T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.059438 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/1.log" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.060332 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/0.log" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.065445 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0" exitCode=1 Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.065534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.065688 4794 scope.go:117] "RemoveContainer" containerID="e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.066906 4794 scope.go:117] "RemoveContainer" containerID="94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0" Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.067187 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.074489 4794 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.074537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.074555 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.074610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.074627 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.084278 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.103150 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.117460 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc 
kubenswrapper[4794]: I1215 13:54:42.132319 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.150327 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.165160 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.177130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.177171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.177187 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc 
kubenswrapper[4794]: I1215 13:54:42.177204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.177216 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.179354 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.197537 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.213851 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.228802 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.246988 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.277147 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"message\\\":\\\"t node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z]\\\\nI1215 13:54:39.190896 6093 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-daemon per-node LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190909 6093 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-daemon template LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190840 6093 services_controller.go:434] Service openshift-kube-apiserver/apiserver retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{apiserver openshift-kube-apiserver 1c35c6c5-2bb9-4633-8ecb-881a1ff8d2fe 7887 0 2025-02-23 05:33:28 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:2787a90499aeabb4cf7acbefa3d43f6c763431fdc60904fdfa1fe74cd04203ee] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:41Z\\\",\\\"message\\\":\\\"uster-monitoring-operator on namespace openshift-monitoring for network=default : 11.8µs\\\\nI1215 13:54:40.863502 6348 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:40.863290 6348 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1215 13:54:40.863517 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1215 13:54:40.863527 6348 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 236.907µs\\\\nI1215 13:54:40.863533 6348 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.129429ms\\\\nI1215 13:54:40.863535 6348 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1215 13:54:40.863502 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1215 13:54:40.863623 6348 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.892622ms\\\\nF1215 13:54:40.863646 6348 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299
ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.279855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.279938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.279960 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.279992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.280014 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.289854 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.302170 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.316305 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.319795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.319870 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.319888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.320216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.320261 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.329525 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.331754 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.337153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.337207 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.337226 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.337250 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.337271 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.352364 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.356093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.356139 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.356150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.356171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.356187 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.374368 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.379388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.379445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.379464 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.379491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.379507 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.397851 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.402054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.402116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.402129 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.402149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.402164 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.416098 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:42Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:42 crc kubenswrapper[4794]: E1215 13:54:42.416252 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.418141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.418193 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.418209 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.418227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.418242 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.522090 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.522195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.522214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.522252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.522289 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.625441 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.625760 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.625775 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.625790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.625801 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.729566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.729675 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.729694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.729722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.729740 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.832979 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.833020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.833029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.833043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.833052 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.936078 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.936134 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.936150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.936173 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:42 crc kubenswrapper[4794]: I1215 13:54:42.936205 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:42Z","lastTransitionTime":"2025-12-15T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.039783 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.039845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.039865 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.039890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.039909 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.072844 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/1.log" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.142347 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.142830 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.143098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.143311 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.143518 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.246222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.246520 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.246686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.246861 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.246984 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.349608 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.349969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.350145 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.350337 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.350495 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.452568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.452676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.452700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.452731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.452757 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.555676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.555756 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.555793 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.555825 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.555847 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.658798 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.658888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.658898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.658924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.658937 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.736902 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:43 crc kubenswrapper[4794]: E1215 13:54:43.737041 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.737493 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.737577 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.737697 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:43 crc kubenswrapper[4794]: E1215 13:54:43.737610 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:43 crc kubenswrapper[4794]: E1215 13:54:43.737807 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:43 crc kubenswrapper[4794]: E1215 13:54:43.737940 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.761738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.761799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.761818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.761843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.761861 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.864124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.864190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.864208 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.864232 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.864249 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.966992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.967069 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.967093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.967122 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:43 crc kubenswrapper[4794]: I1215 13:54:43.967141 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:43Z","lastTransitionTime":"2025-12-15T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.070060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.070124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.070150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.070180 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.070201 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.173529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.173613 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.173632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.173656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.173675 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.276454 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.276937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.277135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.277317 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.277465 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.380865 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.380935 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.380979 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.381010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.381033 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.484754 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.484811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.484829 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.484853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.484871 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.588818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.589035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.589079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.589116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.589144 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.691565 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.691672 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.691695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.691728 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.691750 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.794858 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.794946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.794969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.794994 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.795014 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.898243 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.898307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.898326 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.898349 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:44 crc kubenswrapper[4794]: I1215 13:54:44.898370 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:44Z","lastTransitionTime":"2025-12-15T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.001171 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.001269 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.001297 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.001326 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.001344 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.104422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.104537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.104574 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.104658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.104682 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.207635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.207698 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.207710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.207732 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.207754 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.310343 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.310407 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.310429 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.310453 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.310471 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.413458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.413514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.413527 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.413545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.413556 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.516656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.516744 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.516767 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.516799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.516823 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.619369 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.619421 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.619432 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.619451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.619463 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.722696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.722750 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.722759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.722775 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.722786 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.736053 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.736108 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.736138 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.736233 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:45 crc kubenswrapper[4794]: E1215 13:54:45.736233 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:45 crc kubenswrapper[4794]: E1215 13:54:45.736336 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:45 crc kubenswrapper[4794]: E1215 13:54:45.736565 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:45 crc kubenswrapper[4794]: E1215 13:54:45.736775 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.826637 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.826772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.826797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.826826 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.826845 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.930480 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.930540 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.930563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.930624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:45 crc kubenswrapper[4794]: I1215 13:54:45.930654 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:45Z","lastTransitionTime":"2025-12-15T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.034162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.034223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.034242 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.034270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.034292 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.136840 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.136964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.136989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.137018 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.137040 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.239562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.239654 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.239673 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.239697 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.239719 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.342734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.342797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.342816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.342836 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.342847 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.445602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.445632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.445640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.445653 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.445662 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.548664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.548719 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.548732 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.548749 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.548761 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.652113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.652160 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.652175 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.652199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.652216 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.754994 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.755056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.755076 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.755101 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.755121 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.858333 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.858410 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.858440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.858471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.858492 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.961983 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.962051 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.962065 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.962089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:46 crc kubenswrapper[4794]: I1215 13:54:46.962103 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:46Z","lastTransitionTime":"2025-12-15T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.065460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.065504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.065517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.065537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.065552 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.169302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.169355 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.169367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.169390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.169403 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.272017 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.272083 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.272098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.272123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.272137 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.375382 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.375449 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.375469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.375496 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.375515 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.479222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.479300 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.479325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.479352 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.479375 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.583781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.583882 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.583907 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.583942 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.583966 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.687148 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.687221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.687240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.687267 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.687286 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.736394 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.736421 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.736488 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.736488 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:47 crc kubenswrapper[4794]: E1215 13:54:47.736548 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:47 crc kubenswrapper[4794]: E1215 13:54:47.736646 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:47 crc kubenswrapper[4794]: E1215 13:54:47.736758 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:47 crc kubenswrapper[4794]: E1215 13:54:47.736863 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.790574 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.790668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.790693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.790739 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.790761 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.894131 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.894192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.894209 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.894232 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.894250 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.997527 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.997647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.997677 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.997705 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:47 crc kubenswrapper[4794]: I1215 13:54:47.997727 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:47Z","lastTransitionTime":"2025-12-15T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.100179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.100255 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.100279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.100313 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.100337 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.203097 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.203159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.203180 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.203206 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.203225 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.306490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.306643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.306666 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.306690 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.306707 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.410179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.410249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.410269 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.410294 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.410311 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.513462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.513504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.513525 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.513543 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.513556 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.617107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.617169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.617180 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.617201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.617214 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.719785 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.719839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.719853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.719874 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.719890 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.752954 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.769236 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.800457 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"message\\\":\\\"t node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z]\\\\nI1215 13:54:39.190896 6093 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-daemon per-node LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190909 6093 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-daemon template LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190840 6093 services_controller.go:434] Service openshift-kube-apiserver/apiserver retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{apiserver openshift-kube-apiserver 1c35c6c5-2bb9-4633-8ecb-881a1ff8d2fe 7887 0 2025-02-23 05:33:28 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:2787a90499aeabb4cf7acbefa3d43f6c763431fdc60904fdfa1fe74cd04203ee] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:41Z\\\",\\\"message\\\":\\\"uster-monitoring-operator on namespace openshift-monitoring for network=default : 11.8µs\\\\nI1215 13:54:40.863502 6348 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:40.863290 6348 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1215 13:54:40.863517 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1215 13:54:40.863527 6348 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 236.907µs\\\\nI1215 13:54:40.863533 6348 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.129429ms\\\\nI1215 13:54:40.863535 6348 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1215 13:54:40.863502 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1215 13:54:40.863623 6348 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.892622ms\\\\nF1215 13:54:40.863646 6348 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299
ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.815825 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.822918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.822969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.822988 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.823013 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.823032 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.834199 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.861402 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.880029 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.900262 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.914473 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.926549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.926643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.926662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.926688 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.926705 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:48Z","lastTransitionTime":"2025-12-15T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.933549 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.950953 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc 
kubenswrapper[4794]: I1215 13:54:48.971413 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:48 crc kubenswrapper[4794]: I1215 13:54:48.990728 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:48Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.010161 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:49Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.026690 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:49Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.028857 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.028921 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.028940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 
13:54:49.028965 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.028982 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.047716 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b
24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:49Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.132540 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.132642 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.132660 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.132684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.132702 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.235072 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.235156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.235183 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.235213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.235238 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.338767 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.338838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.338860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.338890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.338911 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.441757 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.441810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.441827 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.441852 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.441872 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.544973 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.545338 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.545620 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.545805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.545939 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.648957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.649260 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.649654 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.650257 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.650469 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.734957 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735138 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-15 13:55:21.735099649 +0000 UTC m=+83.587122127 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.735216 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.735290 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.735364 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.735436 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735452 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735485 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735505 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.735504 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735577 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:55:21.735552811 +0000 UTC m=+83.587575289 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735613 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735661 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735686 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735704 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735762 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:55:21.735738277 +0000 UTC m=+83.587760815 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735801 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735802 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.735801 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:55:05.735781568 +0000 UTC m=+67.587804176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.736044 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.736068 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.736091 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:55:21.735983464 +0000 UTC m=+83.588005942 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.736154 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:55:21.736133608 +0000 UTC m=+83.588156086 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.736184 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.736248 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.736262 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.736397 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.736701 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:49 crc kubenswrapper[4794]: E1215 13:54:49.736782 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.753791 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.753845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.753861 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.753886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.753905 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.857142 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.857201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.857214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.857230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.857242 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.961126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.961186 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.961202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.961254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:49 crc kubenswrapper[4794]: I1215 13:54:49.961272 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:49Z","lastTransitionTime":"2025-12-15T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.064113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.064172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.064190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.064214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.064233 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.167176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.167223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.167233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.167246 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.167255 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.270645 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.270690 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.270707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.270729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.270746 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.373841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.373914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.373935 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.373959 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.373976 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.477123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.477206 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.477233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.477270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.477294 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.580546 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.580640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.580663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.580688 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.580710 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.684087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.684138 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.684152 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.684173 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.684188 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.787772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.787830 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.787846 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.787873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.787893 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.908083 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.908152 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.908173 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.908203 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:50 crc kubenswrapper[4794]: I1215 13:54:50.908226 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:50Z","lastTransitionTime":"2025-12-15T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.011083 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.011124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.011135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.011151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.011162 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.113938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.113986 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.113996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.114013 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.114024 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.219175 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.219269 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.219288 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.219314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.219332 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.322019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.322082 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.322099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.322122 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.322138 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.425350 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.425416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.425436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.425461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.425480 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.528766 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.528835 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.528861 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.528892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.528918 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.632124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.632198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.632221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.632258 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.632340 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736087 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736187 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736257 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736376 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736470 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: E1215 13:54:51.736466 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736492 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736558 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736619 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: E1215 13:54:51.736662 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.736790 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:51 crc kubenswrapper[4794]: E1215 13:54:51.736802 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:51 crc kubenswrapper[4794]: E1215 13:54:51.736958 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.839281 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.839347 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.839364 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.839389 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.839407 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.942632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.942684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.942702 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.942721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:51 crc kubenswrapper[4794]: I1215 13:54:51.942736 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:51Z","lastTransitionTime":"2025-12-15T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.045758 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.046092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.046296 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.046550 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.046745 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.150147 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.150229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.150243 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.150273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.150289 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.253663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.253731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.253766 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.253798 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.253817 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.357441 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.357525 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.357549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.357629 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.357655 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.428001 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.428071 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.428096 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.428126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.428150 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: E1215 13:54:52.453270 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:52Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.458388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.458457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.458476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.458502 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.458521 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: E1215 13:54:52.477685 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:52Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.483189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.483257 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.483277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.483301 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.483318 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: E1215 13:54:52.500213 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:52Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.504896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.504939 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.504950 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.504967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.504981 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: E1215 13:54:52.518896 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:52Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.522922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.522971 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.522989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.523010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.523026 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: E1215 13:54:52.539250 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:52Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:52 crc kubenswrapper[4794]: E1215 13:54:52.539423 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.541181 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.541220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.541236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.541257 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.541271 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.644511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.644561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.644573 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.644621 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.644633 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.747355 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.747402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.747414 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.747430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.747444 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.851029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.851093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.851113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.851146 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.851169 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.953925 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.953989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.954015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.954048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:52 crc kubenswrapper[4794]: I1215 13:54:52.954072 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:52Z","lastTransitionTime":"2025-12-15T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.057398 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.057441 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.057452 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.057469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.057480 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.160806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.160837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.160847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.160864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.160875 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.263838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.263884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.263900 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.263923 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.263941 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.366695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.366759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.366782 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.366809 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.366830 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.490155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.490226 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.490248 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.490277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.490350 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.592451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.592501 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.592515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.592533 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.592546 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.695303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.695356 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.695373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.695392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.695408 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.737084 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.737101 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:53 crc kubenswrapper[4794]: E1215 13:54:53.737297 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.737112 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.737123 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:53 crc kubenswrapper[4794]: E1215 13:54:53.737422 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:53 crc kubenswrapper[4794]: E1215 13:54:53.737567 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:53 crc kubenswrapper[4794]: E1215 13:54:53.737716 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.797639 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.797672 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.797682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.797697 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.797709 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.899804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.899855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.899872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.899888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:53 crc kubenswrapper[4794]: I1215 13:54:53.899898 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:53Z","lastTransitionTime":"2025-12-15T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.002878 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.002945 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.002966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.002989 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.003003 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.104990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.105091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.105117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.105149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.105167 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.207534 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.207664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.207698 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.207731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.207755 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.310956 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.311031 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.311053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.311083 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.311105 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.414072 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.414143 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.414161 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.414187 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.414206 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.457472 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.469728 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.481338 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.496842 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.517130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.517215 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.517242 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.517273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.517296 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.528227 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2b23e02218c5d4efe59ffd540b4dad859dbe9e39c9d44c85e99445d21e8b045\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"message\\\":\\\"t node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-15T13:54:39Z is after 2025-08-24T17:21:41Z]\\\\nI1215 13:54:39.190896 6093 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-daemon per-node LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190909 6093 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-daemon template LB for network=default: []services.LB{}\\\\nI1215 13:54:39.190840 6093 services_controller.go:434] Service openshift-kube-apiserver/apiserver retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{apiserver openshift-kube-apiserver 1c35c6c5-2bb9-4633-8ecb-881a1ff8d2fe 7887 0 2025-02-23 05:33:28 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[operator.openshift.io/spec-hash:2787a90499aeabb4cf7acbefa3d43f6c763431fdc60904fdfa1fe74cd04203ee] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 6443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:41Z\\\",\\\"message\\\":\\\"uster-monitoring-operator on namespace openshift-monitoring for network=default : 11.8µs\\\\nI1215 13:54:40.863502 6348 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:40.863290 6348 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1215 13:54:40.863517 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1215 13:54:40.863527 6348 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 236.907µs\\\\nI1215 13:54:40.863533 6348 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.129429ms\\\\nI1215 13:54:40.863535 6348 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1215 13:54:40.863502 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1215 13:54:40.863623 6348 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.892622ms\\\\nF1215 13:54:40.863646 6348 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299
ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.542570 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.559060 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T1
3:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.573435 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.587317 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca8
9c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.599443 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.611070 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.620418 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.620478 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.620523 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.620549 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.620569 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.626811 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc 
kubenswrapper[4794]: I1215 13:54:54.645394 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.664701 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.682126 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.700410 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.722307 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.723035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.723074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.723086 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.723103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.723115 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.744658 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:54Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.826155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.826273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.826316 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.826353 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.827358 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.936126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.936219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.936238 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.936262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:54 crc kubenswrapper[4794]: I1215 13:54:54.936280 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:54Z","lastTransitionTime":"2025-12-15T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.038331 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.038373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.038384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.038400 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.038413 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.140484 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.140537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.140550 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.140567 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.140605 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.243180 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.243252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.243276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.243311 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.243334 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.346511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.346816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.346838 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.346863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.346885 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.449344 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.449399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.449418 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.449446 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.449480 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.553269 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.553337 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.553363 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.553394 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.553416 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.656451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.656513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.656530 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.656555 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.656572 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.736343 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.736366 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:55 crc kubenswrapper[4794]: E1215 13:54:55.736543 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.736651 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:55 crc kubenswrapper[4794]: E1215 13:54:55.736904 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:55 crc kubenswrapper[4794]: E1215 13:54:55.737028 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.737031 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:55 crc kubenswrapper[4794]: E1215 13:54:55.737184 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.739319 4794 scope.go:117] "RemoveContainer" containerID="94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.760075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.760182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.760203 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.760228 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.760250 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.762862 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.790910 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.811208 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.831225 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.846908 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.862806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.862927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.862999 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc 
kubenswrapper[4794]: I1215 13:54:55.863034 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.863091 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.868248 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.887779 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.905977 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.930930 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:41Z\\\",\\\"message\\\":\\\"uster-monitoring-operator on namespace openshift-monitoring for network=default : 11.8µs\\\\nI1215 13:54:40.863502 6348 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:40.863290 6348 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1215 13:54:40.863517 6348 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1215 13:54:40.863527 6348 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 236.907µs\\\\nI1215 13:54:40.863533 6348 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.129429ms\\\\nI1215 13:54:40.863535 6348 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1215 13:54:40.863502 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1215 13:54:40.863623 6348 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.892622ms\\\\nF1215 13:54:40.863646 6348 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.953287 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.966832 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.966873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.966889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.966912 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.966929 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:55Z","lastTransitionTime":"2025-12-15T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.973083 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:55 crc kubenswrapper[4794]: I1215 13:54:55.993111 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:55Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.006327 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.019696 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.036141 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.051075 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.065726 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc 
kubenswrapper[4794]: I1215 13:54:56.069288 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.069327 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.069338 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.069351 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.069360 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.129372 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/1.log" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.132128 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.132763 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.147766 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.161210 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.172008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.172046 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.172054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.172069 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.172078 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.194417 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:41Z\\\",\\\"message\\\":\\\"uster-monitoring-operator on namespace openshift-monitoring for network=default : 11.8µs\\\\nI1215 13:54:40.863502 6348 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:40.863290 6348 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1215 13:54:40.863517 6348 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1215 13:54:40.863527 6348 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 236.907µs\\\\nI1215 13:54:40.863533 6348 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.129429ms\\\\nI1215 13:54:40.863535 6348 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1215 13:54:40.863502 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1215 13:54:40.863623 6348 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.892622ms\\\\nF1215 13:54:40.863646 6348 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.210260 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.236601 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T1
3:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.253527 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.262631 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca8
9c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.273836 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.273880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.273893 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.273910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.273930 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.274751 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.282493 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.292034 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.301137 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc 
kubenswrapper[4794]: I1215 13:54:56.311998 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.322279 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.332115 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.342572 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.352710 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.366560 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:56Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.376298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.376356 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.376374 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.376399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.376419 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.478166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.478197 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.478207 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.478220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.478229 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.580788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.580826 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.580857 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.580877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.580890 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.683321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.683405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.683427 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.683458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.683480 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.787378 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.787509 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.787531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.787562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.787615 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.890516 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.890563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.890597 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.890618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.890633 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.993927 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.994002 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.994026 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.994056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:56 crc kubenswrapper[4794]: I1215 13:54:56.994080 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:56Z","lastTransitionTime":"2025-12-15T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.098071 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.098145 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.098166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.098194 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.098215 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.137279 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/2.log" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.137957 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/1.log" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.141087 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500" exitCode=1 Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.141122 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.141180 4794 scope.go:117] "RemoveContainer" containerID="94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.141887 4794 scope.go:117] "RemoveContainer" containerID="e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500" Dec 15 13:54:57 crc kubenswrapper[4794]: E1215 13:54:57.142068 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.162280 4794 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.180207 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.198206 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.201206 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.201255 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.201271 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.201293 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.201307 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.212098 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc 
kubenswrapper[4794]: I1215 13:54:57.233168 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.249681 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.267114 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.284941 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.304811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.304869 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.304889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.304917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.304936 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.309742 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.332127 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.352210 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.367611 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.382365 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.407928 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.407992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.408009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.408034 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.408054 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.413167 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94da3a28a0b2a689dd0d81db2ee159544fa4d20f35ce50deacf4813ff5af08d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:41Z\\\",\\\"message\\\":\\\"uster-monitoring-operator on namespace openshift-monitoring for network=default : 11.8µs\\\\nI1215 13:54:40.863502 6348 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:40.863290 6348 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1215 13:54:40.863517 6348 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1215 13:54:40.863527 6348 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 236.907µs\\\\nI1215 13:54:40.863533 6348 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.129429ms\\\\nI1215 13:54:40.863535 6348 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1215 13:54:40.863502 6348 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1215 13:54:40.863623 6348 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.892622ms\\\\nF1215 13:54:40.863646 6348 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299
ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.430164 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.451954 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.471338 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:57Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.510566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.510648 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.510671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.510701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.510723 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.614360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.614437 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.614457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.614486 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.614510 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.719939 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.720023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.720042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.720069 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.720091 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.736356 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:57 crc kubenswrapper[4794]: E1215 13:54:57.736526 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.736802 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.736906 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:57 crc kubenswrapper[4794]: E1215 13:54:57.737117 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.737149 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:57 crc kubenswrapper[4794]: E1215 13:54:57.737270 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:57 crc kubenswrapper[4794]: E1215 13:54:57.737477 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.824213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.824277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.824300 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.824329 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.824353 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.927955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.928024 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.928043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.928070 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:57 crc kubenswrapper[4794]: I1215 13:54:57.928090 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:57Z","lastTransitionTime":"2025-12-15T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.031631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.031697 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.031720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.031751 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.031773 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.135110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.135182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.135206 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.135236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.135259 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.148459 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/2.log" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.153753 4794 scope.go:117] "RemoveContainer" containerID="e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500" Dec 15 13:54:58 crc kubenswrapper[4794]: E1215 13:54:58.154004 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.181219 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:0
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.201731 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.219459 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.232527 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.237541 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.237604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.237616 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc 
kubenswrapper[4794]: I1215 13:54:58.237632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.237645 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.246819 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.262619 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.277933 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.290977 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.313362 4794 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.326892 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.340231 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.340491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.340563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.340661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.340739 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.341550 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.355460 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.371677 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.391040 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.405159 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.420859 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.436550 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc 
kubenswrapper[4794]: I1215 13:54:58.444133 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.444192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.444217 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.444249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.444271 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.547149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.547210 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.547227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.547263 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.547280 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.649822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.649860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.649872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.649888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.649900 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.751788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.751839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.751851 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.751867 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.751879 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.754361 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.768570 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.787068 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.802426 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.816315 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.829742 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.841904 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc 
kubenswrapper[4794]: I1215 13:54:58.855640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.855672 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.855680 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.855693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.855701 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.868264 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.888421 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.903243 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\
\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.918125 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.935209 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.954931 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.958719 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.958781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.958794 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.958812 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.958830 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:58Z","lastTransitionTime":"2025-12-15T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.971734 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:58 crc kubenswrapper[4794]: I1215 13:54:58.988771 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:58Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.028089 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:59Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.039839 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:54:59Z is after 2025-08-24T17:21:41Z" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.061073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.061110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.061121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.061135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.061173 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.163443 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.163493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.163504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.163520 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.163532 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.266184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.266213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.266221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.266234 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.266243 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.424815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.424883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.424899 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.424919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.424933 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.527904 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.527964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.527981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.528007 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.528025 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.631179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.631232 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.631256 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.631281 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.631305 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.734846 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.734908 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.734932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.734961 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.734985 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.736439 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.736452 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.736530 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.736544 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:54:59 crc kubenswrapper[4794]: E1215 13:54:59.736750 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:54:59 crc kubenswrapper[4794]: E1215 13:54:59.736893 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:54:59 crc kubenswrapper[4794]: E1215 13:54:59.737050 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:54:59 crc kubenswrapper[4794]: E1215 13:54:59.737166 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.837981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.838051 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.838073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.838095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.838112 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.940835 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.940898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.940920 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.940948 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:54:59 crc kubenswrapper[4794]: I1215 13:54:59.940972 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:54:59Z","lastTransitionTime":"2025-12-15T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.043985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.044658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.044681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.044700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.044716 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.148167 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.148251 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.148275 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.148314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.148337 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.250805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.250866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.250881 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.250898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.250939 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.353692 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.353791 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.353811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.353879 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.353895 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.457116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.457174 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.457192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.457216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.457233 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.560545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.560622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.560633 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.560650 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.560662 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.663537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.663621 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.663642 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.663664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.663681 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.765757 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.765824 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.765845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.765880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.765904 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.868387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.868448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.868465 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.868490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.868508 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.971772 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.971808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.971818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.971833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:00 crc kubenswrapper[4794]: I1215 13:55:00.971842 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:00Z","lastTransitionTime":"2025-12-15T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.073948 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.074004 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.074022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.074045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.074063 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.176815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.176853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.176862 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.176876 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.176884 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.279117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.279155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.279165 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.279178 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.279187 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.381411 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.381697 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.381807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.381911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.381990 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.484871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.484938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.484961 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.484992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.485014 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.587934 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.588251 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.588326 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.588394 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.588462 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.691704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.691774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.691795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.691823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.691845 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.736596 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:01 crc kubenswrapper[4794]: E1215 13:55:01.736743 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.736565 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.736598 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:01 crc kubenswrapper[4794]: E1215 13:55:01.736828 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.736565 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:01 crc kubenswrapper[4794]: E1215 13:55:01.736978 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:01 crc kubenswrapper[4794]: E1215 13:55:01.737122 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.794517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.794572 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.794627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.794656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.794679 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.897873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.897924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.897940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.897963 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:01 crc kubenswrapper[4794]: I1215 13:55:01.897979 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:01Z","lastTransitionTime":"2025-12-15T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.001456 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.001515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.001531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.001553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.001570 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.104305 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.104354 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.104371 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.104392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.104410 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.206048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.206089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.206104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.206122 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.206135 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.308502 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.308546 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.308563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.308622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.308639 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.411620 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.411674 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.411695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.411721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.411741 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.514304 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.514364 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.514381 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.514404 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.514430 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.616611 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.616637 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.616646 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.616659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.616668 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.719143 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.719178 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.719189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.719204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.719217 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.822743 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.822789 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.822810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.822826 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.822839 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.824153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.824187 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.824200 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.824220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.824245 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: E1215 13:55:02.838213 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:02Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.841392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.841423 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.841434 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.841449 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.841460 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: E1215 13:55:02.852367 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:02Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.856207 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.856235 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.856249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.856269 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.856282 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: E1215 13:55:02.867607 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:02Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.870419 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.870445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.870456 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.870471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.870482 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: E1215 13:55:02.881971 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:02Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.885332 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.885364 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.885375 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.885388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.885398 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:02 crc kubenswrapper[4794]: E1215 13:55:02.897018 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:02Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:02 crc kubenswrapper[4794]: E1215 13:55:02.897234 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.925536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.925604 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.925616 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.925631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:02 crc kubenswrapper[4794]: I1215 13:55:02.925641 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:02Z","lastTransitionTime":"2025-12-15T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.028372 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.028485 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.028505 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.028521 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.028533 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.131862 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.131934 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.131958 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.131985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.132007 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.235227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.235264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.235275 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.235290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.235302 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.339055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.339122 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.339149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.339176 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.339197 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.441749 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.441789 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.441800 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.441814 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.441824 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.547751 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.547816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.547840 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.547868 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.547889 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.650740 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.650780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.650792 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.650807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.650815 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.737161 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:03 crc kubenswrapper[4794]: E1215 13:55:03.737349 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.737747 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.737797 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.737759 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:03 crc kubenswrapper[4794]: E1215 13:55:03.737870 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:03 crc kubenswrapper[4794]: E1215 13:55:03.738288 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:03 crc kubenswrapper[4794]: E1215 13:55:03.738497 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.753693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.753756 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.753774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.753801 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.753817 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.855892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.855946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.855964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.855988 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.856005 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.958680 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.958720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.958729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.958742 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:03 crc kubenswrapper[4794]: I1215 13:55:03.958754 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:03Z","lastTransitionTime":"2025-12-15T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.060776 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.060813 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.060824 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.060839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.060849 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.163307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.163361 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.163373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.163393 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.163405 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.265960 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.266011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.266022 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.266035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.266047 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.368930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.368981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.368992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.369009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.369021 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.471164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.471244 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.471284 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.471317 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.471340 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.576365 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.576484 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.576507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.576534 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.576558 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.679009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.679066 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.679079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.679094 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.679109 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.781925 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.781965 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.781975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.781993 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.782006 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.884964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.885016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.885032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.885057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.885074 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.987624 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.987678 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.987691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.987711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:04 crc kubenswrapper[4794]: I1215 13:55:04.987724 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:04Z","lastTransitionTime":"2025-12-15T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.090388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.090484 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.090500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.090518 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.090530 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.192383 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.192415 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.192423 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.192435 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.192444 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.294864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.294917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.294928 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.294946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.294960 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.396659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.396723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.396734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.396747 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.396755 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.499191 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.499242 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.499254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.499277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.499289 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.602500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.602554 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.602563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.602607 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.602618 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.705038 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.705084 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.705098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.705118 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.705130 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.736838 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.736881 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:05 crc kubenswrapper[4794]: E1215 13:55:05.736999 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.737033 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.737121 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:05 crc kubenswrapper[4794]: E1215 13:55:05.737308 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:05 crc kubenswrapper[4794]: E1215 13:55:05.737534 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:05 crc kubenswrapper[4794]: E1215 13:55:05.737649 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.750060 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.808616 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.808746 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.808783 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.808815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.808835 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.821395 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:05 crc kubenswrapper[4794]: E1215 13:55:05.821629 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:55:05 crc kubenswrapper[4794]: E1215 13:55:05.821769 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:55:37.821739148 +0000 UTC m=+99.673761626 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.911397 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.911444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.911457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.911473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:05 crc kubenswrapper[4794]: I1215 13:55:05.911485 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:05Z","lastTransitionTime":"2025-12-15T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.014191 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.014262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.014285 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.014313 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.014331 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.116669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.116729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.116747 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.116773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.116894 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.220482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.220564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.220622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.220653 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.220675 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.322679 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.322737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.322756 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.322780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.322799 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.424794 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.424851 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.424867 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.424889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.424906 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.527479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.527521 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.527533 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.527553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.527566 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.629619 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.629683 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.629700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.629723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.629742 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.732252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.732294 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.732304 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.732320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.732329 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.834199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.834261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.834278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.834303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.834321 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.937169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.937211 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.937219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.937234 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:06 crc kubenswrapper[4794]: I1215 13:55:06.937243 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:06Z","lastTransitionTime":"2025-12-15T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.039197 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.039237 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.039254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.039274 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.039290 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.141342 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.141373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.141384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.141402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.141412 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.243626 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.243718 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.243735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.243759 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.243813 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.345800 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.345835 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.345847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.345863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.345876 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.449195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.449247 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.449261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.449278 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.449289 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.552363 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.552425 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.552446 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.552468 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.552485 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.654535 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.654614 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.654632 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.654655 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.654673 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.736657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.736680 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.736762 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:07 crc kubenswrapper[4794]: E1215 13:55:07.736819 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.736844 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:07 crc kubenswrapper[4794]: E1215 13:55:07.736948 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:07 crc kubenswrapper[4794]: E1215 13:55:07.737088 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:07 crc kubenswrapper[4794]: E1215 13:55:07.737197 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.757733 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.757787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.757800 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.757820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.757832 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.860417 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.860460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.860471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.860488 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.860496 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.962914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.963129 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.963164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.963190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:07 crc kubenswrapper[4794]: I1215 13:55:07.963206 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:07Z","lastTransitionTime":"2025-12-15T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.065865 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.065922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.065943 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.065967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.065985 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.168600 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.168631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.168644 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.168659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.168668 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.271366 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.271406 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.271414 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.271428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.271437 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.374225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.374270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.374282 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.374297 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.374306 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.476669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.476720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.476733 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.476750 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.476761 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.579087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.579145 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.579164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.579185 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.579203 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.683078 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.683130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.683142 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.683159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.683171 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.753263 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.767309 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.785486 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.785852 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.785879 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.785890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.785906 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.785942 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.797671 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc 
kubenswrapper[4794]: I1215 13:55:08.812167 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.826352 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.838328 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.851971 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.865667 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.881370 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.888112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.888147 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.888159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.888177 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.888188 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.891232 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.902428 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.913439 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.931785 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.944370 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.954606 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T1
3:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.965110 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.976401 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca8
9c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:08Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.989880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.989929 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.989940 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.989957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:08 crc kubenswrapper[4794]: I1215 13:55:08.989969 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:08Z","lastTransitionTime":"2025-12-15T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.092524 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.092556 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.092564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.092576 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.092605 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.185999 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/0.log" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.186107 4794 generic.go:334] "Generic (PLEG): container finished" podID="0bc89ecc-eb8e-4926-bbb7-14c90f449e00" containerID="babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95" exitCode=1 Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.186174 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerDied","Data":"babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.187047 4794 scope.go:117] "RemoveContainer" containerID="babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.195074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.195229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.195335 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.195431 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.195519 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.208491 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.226493 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.241765 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.251410 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc 
kubenswrapper[4794]: I1215 13:55:09.263862 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.275819 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.285421 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.298063 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.298095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.298107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.298123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.298135 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.300501 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.314023 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.330113 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.342421 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.355749 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.368176 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.385675 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.393931 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.400186 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.400229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.400242 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.400258 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.400270 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.404199 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.414006 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.423954 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:09Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.502435 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.502487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.502503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.502526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.502542 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.604814 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.604842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.604851 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.604864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.604873 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.707313 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.707366 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.707378 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.707396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.707408 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.736287 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.736339 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:09 crc kubenswrapper[4794]: E1215 13:55:09.736427 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.736449 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.736552 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:09 crc kubenswrapper[4794]: E1215 13:55:09.736747 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:09 crc kubenswrapper[4794]: E1215 13:55:09.736927 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:09 crc kubenswrapper[4794]: E1215 13:55:09.737042 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.809407 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.809463 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.809481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.809507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.809524 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.912395 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.912440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.912451 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.912470 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:09 crc kubenswrapper[4794]: I1215 13:55:09.912482 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:09Z","lastTransitionTime":"2025-12-15T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.018799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.018855 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.018867 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.018884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.018896 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.121255 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.121302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.121312 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.121325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.121335 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.191101 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/0.log" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.191176 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerStarted","Data":"38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.211674 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.223047 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.223079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.223089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.223105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.223115 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.226742 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.236838 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a
2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.249893 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc 
kubenswrapper[4794]: I1215 13:55:10.266045 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.277145 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.291665 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.304965 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.316000 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.325647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.325827 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.325845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.325868 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.325884 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.334847 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.348206 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.362775 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.380325 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.409987 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.423303 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.427974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.428014 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.428024 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.428038 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.428047 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.440288 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.461147 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1
b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\"
,\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.477994 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:10Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.531525 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.531609 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.531627 4794 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.531652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.531670 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.633967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.634011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.634024 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.634041 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.634056 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.743135 4794 scope.go:117] "RemoveContainer" containerID="e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.743277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.743317 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.743326 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: E1215 13:55:10.743321 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.743340 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.743353 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.846431 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.846474 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.846485 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.846500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.846512 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.949243 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.949326 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.949350 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.949377 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:10 crc kubenswrapper[4794]: I1215 13:55:10.949394 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:10Z","lastTransitionTime":"2025-12-15T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.052406 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.052443 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.052454 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.052473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.052484 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.154514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.154564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.154600 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.154618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.154630 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.257301 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.257331 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.257341 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.257355 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.257366 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.359606 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.359638 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.359646 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.359659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.359668 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.461984 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.462030 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.462044 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.462063 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.462077 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.564864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.564924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.564938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.564952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.564964 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.667413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.667445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.667458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.667473 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.667485 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.736102 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.736141 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.736148 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.736159 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:11 crc kubenswrapper[4794]: E1215 13:55:11.736300 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:11 crc kubenswrapper[4794]: E1215 13:55:11.736355 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:11 crc kubenswrapper[4794]: E1215 13:55:11.736428 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:11 crc kubenswrapper[4794]: E1215 13:55:11.736530 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.769950 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.770002 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.770024 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.770043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.770054 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.872711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.872792 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.872805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.872824 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.872835 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.974635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.974676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.974688 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.974727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:11 crc kubenswrapper[4794]: I1215 13:55:11.974739 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:11Z","lastTransitionTime":"2025-12-15T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.076916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.076959 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.076968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.076981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.076990 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.179636 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.179704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.179721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.179755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.179773 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.282816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.282929 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.282952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.282980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.283001 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.385778 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.385825 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.385841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.385863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.385883 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.488028 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.488057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.488068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.488085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.488096 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.590032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.590078 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.590088 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.590103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.590114 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.692293 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.692335 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.692349 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.692366 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.692376 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.795457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.795729 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.795842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.795940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.796019 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.897915 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.897998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.898021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.898053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.898076 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.974021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.974090 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.974113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.974144 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.974165 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:12 crc kubenswrapper[4794]: E1215 13:55:12.993027 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:12Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.996614 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.996743 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.996831 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.996934 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:12 crc kubenswrapper[4794]: I1215 13:55:12.997025 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:12Z","lastTransitionTime":"2025-12-15T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.013007 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:13Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.017113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.017343 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.017479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.017646 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.017794 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.032082 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:13Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.036834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.036864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.036872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.036887 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.036896 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.052895 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:13Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.056444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.056479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.056495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.056514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.056530 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.073543 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:13Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.073797 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.075798 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.075820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.075828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.075837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.075846 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.178793 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.178856 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.178873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.178896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.178913 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.281396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.281448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.281469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.281491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.281508 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.384149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.384218 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.384240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.384268 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.384284 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.486538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.486610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.486627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.486646 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.486661 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.589872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.589924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.589946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.589978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.590002 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.693422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.693467 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.693476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.693490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.693498 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.736685 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.736724 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.736792 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.736870 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.736915 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.737287 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.737490 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:13 crc kubenswrapper[4794]: E1215 13:55:13.737673 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.795573 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.795910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.795919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.795932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.795941 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.897828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.897892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.897909 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.897935 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:13 crc kubenswrapper[4794]: I1215 13:55:13.897953 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:13Z","lastTransitionTime":"2025-12-15T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.000162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.000230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.000252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.000282 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.000305 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.103629 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.103692 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.103709 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.103734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.103752 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.205815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.205867 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.205889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.205918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.205940 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.308157 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.308217 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.308234 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.308262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.308279 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.410524 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.410617 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.410640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.410668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.410692 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.513493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.513529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.513542 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.513559 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.513571 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.616252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.616293 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.616302 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.616316 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.616326 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.718695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.718754 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.718844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.718875 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.718898 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.821310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.821374 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.821390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.821413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.821430 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.925536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.925639 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.925662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.925691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:14 crc kubenswrapper[4794]: I1215 13:55:14.925712 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:14Z","lastTransitionTime":"2025-12-15T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.028315 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.028360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.028371 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.028388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.028400 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.131737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.131798 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.131816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.131841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.131861 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.234264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.234323 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.234340 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.234360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.234375 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.337320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.337386 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.337407 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.337432 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.337449 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.440118 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.440183 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.440204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.440229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.440252 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.543119 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.543190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.543213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.543245 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.543270 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.667058 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.667131 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.667154 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.667185 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.667206 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.736167 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.736227 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.736190 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.736416 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:15 crc kubenswrapper[4794]: E1215 13:55:15.736536 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:15 crc kubenswrapper[4794]: E1215 13:55:15.736738 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:15 crc kubenswrapper[4794]: E1215 13:55:15.736794 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:15 crc kubenswrapper[4794]: E1215 13:55:15.736895 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.770075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.770113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.770123 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.770140 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.770151 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.872072 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.872112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.872127 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.872169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.872181 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.976984 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.977047 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.977067 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.977092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:15 crc kubenswrapper[4794]: I1215 13:55:15.977115 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:15Z","lastTransitionTime":"2025-12-15T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.080235 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.080314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.080333 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.080387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.080404 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.183396 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.183472 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.183510 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.183542 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.183567 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.287254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.287342 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.287360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.287413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.287431 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.390307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.390384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.390411 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.390442 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.390470 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.493238 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.493285 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.493298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.493317 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.493327 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.597507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.597839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.597910 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.598276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.598342 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.702199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.702298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.702319 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.702344 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.702363 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.805568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.805659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.805676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.805698 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.805715 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.908297 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.908357 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.908379 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.908401 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:16 crc kubenswrapper[4794]: I1215 13:55:16.908418 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:16Z","lastTransitionTime":"2025-12-15T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.011738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.011823 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.011840 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.011863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.011880 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.115016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.115067 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.115084 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.115106 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.115122 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.217623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.217714 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.217738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.217776 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.217800 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.320139 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.320212 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.320235 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.320265 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.320286 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.423139 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.423196 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.423214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.423239 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.423257 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.526353 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.526425 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.526447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.526480 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.526500 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.629068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.629121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.629137 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.629156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.629172 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.732360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.732476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.732500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.732531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.732554 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.736949 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.736986 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.737028 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.737128 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:17 crc kubenswrapper[4794]: E1215 13:55:17.737315 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:17 crc kubenswrapper[4794]: E1215 13:55:17.737496 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:17 crc kubenswrapper[4794]: E1215 13:55:17.737688 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:17 crc kubenswrapper[4794]: E1215 13:55:17.737874 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.835652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.835726 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.835749 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.835777 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.835798 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.939393 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.939461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.939484 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.939512 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:17 crc kubenswrapper[4794]: I1215 13:55:17.939532 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:17Z","lastTransitionTime":"2025-12-15T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.043070 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.043159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.043192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.043225 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.043251 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.146975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.147037 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.147060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.147085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.147103 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.250198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.250277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.250301 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.250332 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.250355 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.354087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.354154 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.354177 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.354204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.354224 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.457523 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.457627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.457647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.457672 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.457691 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.561402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.561499 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.561561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.561651 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.561678 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.665012 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.665080 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.665091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.665109 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.665124 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.754054 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c46
4c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.767844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.767871 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.767918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.767932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.767941 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.769040 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.784714 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.805200 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.827014 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.847709 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.869924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.869971 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.869981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.869996 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.870007 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.870896 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf4
80ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.902573 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.919128 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.933350 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc 
kubenswrapper[4794]: I1215 13:55:18.956671 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.972774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.972844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.972863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.972888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.972909 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:18Z","lastTransitionTime":"2025-12-15T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.976808 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:18 crc kubenswrapper[4794]: I1215 13:55:18.995930 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:18Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.009976 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:55:19Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.022279 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:19Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.032682 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:19Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:19 crc 
kubenswrapper[4794]: I1215 13:55:19.047447 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:19Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.057935 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:19Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.076723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.076762 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.076774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.076792 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.076804 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.179569 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.179640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.179658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.179676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.179689 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.282526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.282630 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.282647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.282671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.282687 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.385737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.385817 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.385850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.385937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.385990 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.488633 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.488682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.488694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.488714 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.488726 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.591223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.591277 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.591295 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.591317 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.591336 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.694471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.694566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.694640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.694709 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.694731 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.736552 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.736679 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.736566 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.736565 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:19 crc kubenswrapper[4794]: E1215 13:55:19.736795 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:19 crc kubenswrapper[4794]: E1215 13:55:19.736947 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:19 crc kubenswrapper[4794]: E1215 13:55:19.737129 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:19 crc kubenswrapper[4794]: E1215 13:55:19.737181 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.797576 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.797685 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.797705 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.797732 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.797754 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.901721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.901784 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.901803 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.901827 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:19 crc kubenswrapper[4794]: I1215 13:55:19.901846 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:19Z","lastTransitionTime":"2025-12-15T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.005159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.005205 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.005220 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.005236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.005247 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.107847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.107905 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.107925 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.107949 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.107967 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.211618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.211671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.211683 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.211703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.211716 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.313995 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.314042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.314056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.314073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.314084 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.417229 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.417280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.417293 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.417314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.417325 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.520714 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.520795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.520818 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.520847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.520866 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.623930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.623990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.624008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.624032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.624051 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.727089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.727153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.727169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.727206 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.727257 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.831200 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.831438 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.831464 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.831495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.831517 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.934360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.934429 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.934455 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.934486 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:20 crc kubenswrapper[4794]: I1215 13:55:20.934507 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:20Z","lastTransitionTime":"2025-12-15T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.037276 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.037343 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.037365 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.037399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.037424 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.140046 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.140090 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.140102 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.140118 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.140129 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.242983 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.243030 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.243040 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.243057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.243071 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.346028 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.346088 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.346105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.346128 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.346144 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.461290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.461352 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.461374 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.461444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.461465 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.566109 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.566202 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.566223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.566250 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.566279 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.669466 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.669521 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.669538 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.669561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.669578 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.737079 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.737086 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.737283 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.737345 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.737429 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.737757 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.737881 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.737978 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.773170 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.773214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.773230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.773252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.773270 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.789008 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789211 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-15 13:56:25.78917927 +0000 UTC m=+147.641201748 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.789292 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.789349 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.789408 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.789460 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789566 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789615 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789633 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789635 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789654 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789679 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-15 13:56:25.789665563 +0000 UTC m=+147.641688041 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789632 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789736 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789741 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:56:25.789706874 +0000 UTC m=+147.641729352 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789751 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789774 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:56:25.789760286 +0000 UTC m=+147.641782754 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:55:21 crc kubenswrapper[4794]: E1215 13:55:21.789835 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:56:25.789806467 +0000 UTC m=+147.641828975 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.876203 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.876274 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.876292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.876320 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.876340 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.978922 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.978964 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.978978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.978998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:21 crc kubenswrapper[4794]: I1215 13:55:21.979013 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:21Z","lastTransitionTime":"2025-12-15T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.081786 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.081847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.081864 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.081888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.081905 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.185843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.185897 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.185913 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.185936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.185953 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.288436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.288481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.288495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.288511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.288547 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.391875 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.391921 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.391930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.391946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.391955 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.494341 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.494399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.494413 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.494433 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.494448 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.597610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.597688 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.597712 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.597744 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.597770 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.700634 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.700671 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.700681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.700695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.700724 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.803837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.803913 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.803932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.803963 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.803983 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.906735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.906787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.906804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.906826 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:22 crc kubenswrapper[4794]: I1215 13:55:22.906842 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:22Z","lastTransitionTime":"2025-12-15T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.009540 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.009665 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.009691 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.009725 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.009751 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.112936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.113015 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.113033 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.113057 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.113241 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.216397 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.216441 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.216455 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.216476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.216492 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.319531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.319623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.319637 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.319656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.319671 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.376185 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.376264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.376289 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.376318 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.376340 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.397179 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.401958 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.402041 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.402065 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.402097 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.402116 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.422057 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.427566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.427679 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.427702 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.427735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.427757 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.449776 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.455748 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.455833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.455847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.455873 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.455887 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.473987 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.479308 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.479347 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.479358 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.479380 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.479395 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.498924 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:23Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.499219 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.501884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.501942 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.501984 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.502019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.502046 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.604967 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.605042 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.605077 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.605108 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.605130 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.709339 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.709452 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.709469 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.709493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.709510 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.736214 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.736296 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.736303 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.736249 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.736508 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.736687 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.736901 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:23 crc kubenswrapper[4794]: E1215 13:55:23.737035 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.813279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.813345 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.813362 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.813386 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.813404 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.916097 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.916183 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.916205 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.916232 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:23 crc kubenswrapper[4794]: I1215 13:55:23.916252 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:23Z","lastTransitionTime":"2025-12-15T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.018511 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.018560 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.018570 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.018608 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.018620 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.122034 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.122076 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.122084 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.122100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.122110 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.225517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.225610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.225634 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.225664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.225688 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.329084 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.329142 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.329159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.329181 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.329195 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.432513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.432615 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.432635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.432661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.432678 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.535695 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.535771 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.535790 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.535815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.535834 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.639046 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.639099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.639111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.639129 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.639141 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.740721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.740781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.740799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.740822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.740838 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.843658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.843719 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.843730 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.843747 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.843759 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.947007 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.947082 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.947106 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.947154 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:24 crc kubenswrapper[4794]: I1215 13:55:24.947176 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:24Z","lastTransitionTime":"2025-12-15T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.049510 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.049552 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.049564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.049599 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.049610 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.151530 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.151602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.151618 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.151636 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.151650 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.254063 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.254121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.254138 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.254167 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.254190 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.355866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.355926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.355946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.355973 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.355992 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.459192 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.459295 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.459322 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.459357 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.459381 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.563153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.563216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.563231 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.563253 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.563268 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.666303 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.666368 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.666390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.666422 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.666447 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.736144 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:25 crc kubenswrapper[4794]: E1215 13:55:25.736369 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.736410 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.736427 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.736448 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:25 crc kubenswrapper[4794]: E1215 13:55:25.736539 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:25 crc kubenswrapper[4794]: E1215 13:55:25.736692 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:25 crc kubenswrapper[4794]: E1215 13:55:25.737332 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.737397 4794 scope.go:117] "RemoveContainer" containerID="e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.769791 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.769841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.769852 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.769886 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.769898 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.872222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.872575 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.872628 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.872644 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.872656 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.975978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.976016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.976030 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.976043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:25 crc kubenswrapper[4794]: I1215 13:55:25.976054 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:25Z","lastTransitionTime":"2025-12-15T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.078629 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.078676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.078687 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.078703 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.078713 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.180911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.180957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.180970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.180990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.181003 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.260786 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/2.log" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.263177 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.263559 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.279988 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.283391 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.283417 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.283427 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.283440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.283448 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.292094 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c46
4c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.302770 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.317028 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.331943 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.346675 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.361360 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.373117 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.385373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.385436 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.385452 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.385467 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.385477 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.389559 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.398191 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.406924 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc 
kubenswrapper[4794]: I1215 13:55:26.416643 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.428684 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"20
25-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"h
ostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.439645 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.450838 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.460700 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.470916 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc 
kubenswrapper[4794]: I1215 13:55:26.485106 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:26Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.487439 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.487467 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.487478 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.487493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.487504 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.589883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.589921 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.589933 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.589948 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.589958 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.692968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.693029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.693050 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.693107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.693127 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.797075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.797165 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.797182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.797207 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.797224 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.899435 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.899466 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.899474 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.899487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:26 crc kubenswrapper[4794]: I1215 13:55:26.899495 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:26Z","lastTransitionTime":"2025-12-15T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.001570 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.001628 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.001643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.001658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.001670 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.105182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.105258 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.105281 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.105310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.105332 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.207940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.207998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.208014 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.208037 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.208053 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.270668 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/3.log" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.271977 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/2.log" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.278144 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" exitCode=1 Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.278219 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.278291 4794 scope.go:117] "RemoveContainer" containerID="e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.279424 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 13:55:27 crc kubenswrapper[4794]: E1215 13:55:27.279831 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.298960 4794 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.310643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.310692 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.310708 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.310728 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.310744 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.320234 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.335006 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.352996 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.367881 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.383163 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.397302 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc 
kubenswrapper[4794]: I1215 13:55:27.413199 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.413253 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.413269 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.413289 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.413305 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.416931 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.435520 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.454121 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\
\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.474309 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.494340 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.516731 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.516811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.516837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.516865 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.516882 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.519229 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.534172 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.552828 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.566986 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.596335 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f6d7f1ff5cb4c65ba506d80824e0858ceb358d7ef27f05181bcecfb1897500\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:54:57Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.617815 6511 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1215 13:54:56.618129 6511 obj_retry.go:551] Creating *factory.egressNode crc took: 1.830702ms\\\\nI1215 13:54:56.618151 6511 factory.go:1336] Added *v1.Node event handler 7\\\\nI1215 13:54:56.618173 6511 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1215 13:54:56.618436 6511 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1215 13:54:56.618506 6511 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1215 13:54:56.618537 6511 ovnkube.go:599] Stopped ovnkube\\\\nI1215 13:54:56.618559 6511 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:54:56.618661 6511 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:26Z\\\",\\\"message\\\":\\\"-config-operator/kube-rbac-proxy-crio-crc openshift-machine-config-operator/machine-config-daemon-fq2s6 openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI1215 13:55:26.561380 6963 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1215 13:55:26.561395 6963 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561407 6963 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561416 6963 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1215 13:55:26.561421 6963 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1215 13:55:26.561427 6963 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561439 6963 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1215 13:55:26.561452 6963 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:55:26.561501 6963 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb
8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.608880 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:27Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.619874 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.619909 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.619930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.619951 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.619966 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.722658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.722693 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.722704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.722719 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.722729 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.736174 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.736168 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.736178 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.736240 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:27 crc kubenswrapper[4794]: E1215 13:55:27.736397 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:27 crc kubenswrapper[4794]: E1215 13:55:27.736486 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:27 crc kubenswrapper[4794]: E1215 13:55:27.736560 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:27 crc kubenswrapper[4794]: E1215 13:55:27.736925 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.824626 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.824683 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.824699 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.824721 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.824739 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.928028 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.928104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.928139 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.928169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:27 crc kubenswrapper[4794]: I1215 13:55:27.928191 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:27Z","lastTransitionTime":"2025-12-15T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.031055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.031098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.031113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.031136 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.031156 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.134003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.134055 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.134077 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.134097 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.134111 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.237405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.237461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.237489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.237532 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.237555 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.284935 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/3.log" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.291116 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 13:55:28 crc kubenswrapper[4794]: E1215 13:55:28.291376 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.306929 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.322448 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc 
kubenswrapper[4794]: I1215 13:55:28.340616 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.341535 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.341637 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.341661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.341690 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.341710 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.363350 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.389422 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:26Z\\\",\\\"message\\\":\\\"-config-operator/kube-rbac-proxy-crio-crc openshift-machine-config-operator/machine-config-daemon-fq2s6 openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI1215 13:55:26.561380 6963 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1215 13:55:26.561395 6963 obj_retry.go:303] Retry 
object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561407 6963 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561416 6963 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1215 13:55:26.561421 6963 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1215 13:55:26.561427 6963 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561439 6963 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1215 13:55:26.561452 6963 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:55:26.561501 6963 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:55:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.404833 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.428123 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.442181 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.444098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.444184 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.444209 4794 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.444239 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.444262 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.463685 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.477901 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.492872 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.509371 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc 
kubenswrapper[4794]: I1215 13:55:28.529496 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.547274 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.547647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.547712 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.547735 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.547767 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.547790 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.565856 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.581744 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.593232 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\
\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.605719 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.650606 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.650647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.650659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc 
kubenswrapper[4794]: I1215 13:55:28.650675 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.650686 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.754179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.754289 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.754315 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.754388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.754412 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.759986 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.781734 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.813683 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.864003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.864065 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.864080 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.864101 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.864117 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.869039 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.896242 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.907365 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\
\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.926460 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:26Z\\\",\\\"message\\\":\\\"-config-operator/kube-rbac-proxy-crio-crc 
openshift-machine-config-operator/machine-config-daemon-fq2s6 openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI1215 13:55:26.561380 6963 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1215 13:55:26.561395 6963 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561407 6963 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561416 6963 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1215 13:55:26.561421 6963 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1215 13:55:26.561427 6963 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561439 6963 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1215 13:55:26.561452 6963 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:55:26.561501 6963 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:55:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.935964 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.947185 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc 
kubenswrapper[4794]: I1215 13:55:28.959534 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.965924 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.965948 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.965957 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.965970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.965979 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:28Z","lastTransitionTime":"2025-12-15T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.972556 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.984763 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c65
4f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:28 crc kubenswrapper[4794]: I1215 13:55:28.997525 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:55:28Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.012213 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:29Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.025747 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:29Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:29 crc 
kubenswrapper[4794]: I1215 13:55:29.038425 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:29Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.049874 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:29Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.064710 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:29Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.068545 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.068636 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.068658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.068682 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.068701 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.171041 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.171117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.171141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.171174 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.171197 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.275321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.275428 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.275455 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.275486 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.275509 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.377770 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.377859 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.377884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.377915 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.377942 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.480761 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.480792 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.480821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.480839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.480856 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.583808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.583872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.583889 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.583916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.583939 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.686785 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.686848 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.686866 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.686890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.686909 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.736806 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.736898 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.736994 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:29 crc kubenswrapper[4794]: E1215 13:55:29.737170 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.737214 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:29 crc kubenswrapper[4794]: E1215 13:55:29.737356 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:29 crc kubenswrapper[4794]: E1215 13:55:29.737809 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:29 crc kubenswrapper[4794]: E1215 13:55:29.737897 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.790425 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.790488 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.790510 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.790541 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.790567 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.893748 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.893785 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.893796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.893811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.893822 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.997806 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.997919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.997938 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.997997 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:29 crc kubenswrapper[4794]: I1215 13:55:29.998020 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:29Z","lastTransitionTime":"2025-12-15T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.104835 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.104887 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.104902 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.104932 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.105052 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.209005 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.209444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.209661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.209884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.210068 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.313259 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.313564 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.313622 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.313652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.313672 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.416475 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.416558 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.416628 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.416663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.416686 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.519562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.519668 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.519687 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.519713 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.519733 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.622796 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.622868 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.622890 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.622920 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.622943 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.726052 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.726106 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.726125 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.726151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.726177 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.755684 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.829385 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.829420 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.829433 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.829447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.829459 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.932439 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.932495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.932507 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.932525 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:30 crc kubenswrapper[4794]: I1215 13:55:30.932537 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:30Z","lastTransitionTime":"2025-12-15T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.034977 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.035028 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.035045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.035070 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.035087 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.138048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.138100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.138108 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.138126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.138169 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.240694 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.240753 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.240782 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.240805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.240819 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.342991 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.343026 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.343038 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.343054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.343064 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.446203 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.446264 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.446282 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.446307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.446323 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.549365 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.549445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.549500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.549526 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.549561 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.651990 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.652064 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.652089 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.652120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.652142 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.736354 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.736388 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.736433 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.736451 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:31 crc kubenswrapper[4794]: E1215 13:55:31.736557 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:31 crc kubenswrapper[4794]: E1215 13:55:31.736677 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:31 crc kubenswrapper[4794]: E1215 13:55:31.736960 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:31 crc kubenswrapper[4794]: E1215 13:55:31.737109 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.754527 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.754621 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.754639 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.754663 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.754679 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.857495 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.857532 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.857543 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.857559 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.857572 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.961060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.961111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.961127 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.961151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:31 crc kubenswrapper[4794]: I1215 13:55:31.961167 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:31Z","lastTransitionTime":"2025-12-15T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.063711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.063798 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.063816 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.063836 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.063881 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.165937 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.165978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.166004 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.166023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.166034 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.269956 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.270036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.270060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.270091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.270114 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.374423 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.374487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.374506 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.374527 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.374547 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.477039 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.477095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.477130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.477153 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.477168 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.580953 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.581007 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.581018 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.581036 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.581048 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.683597 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.683669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.683683 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.683698 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.683709 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.787048 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.787094 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.787104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.787124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.787143 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.889843 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.889877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.889885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.889901 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.889913 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.993562 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.993659 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.993677 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.993707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:32 crc kubenswrapper[4794]: I1215 13:55:32.993726 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:32Z","lastTransitionTime":"2025-12-15T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.096117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.096151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.096159 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.096173 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.096182 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.199239 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.199286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.199297 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.199314 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.199327 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.301858 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.301916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.301935 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.301961 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.301978 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.405459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.405536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.405559 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.405620 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.405640 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.508135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.508226 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.508262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.508292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.508319 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.550454 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.550497 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.550510 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.550528 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.550540 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.566449 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.571214 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.571268 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.571284 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.571308 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.571324 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.587426 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.591734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.591798 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.591821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.591857 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.591881 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.606502 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.611384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.611440 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.611462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.611491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.611512 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.629947 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.635179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.635217 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.635233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.635255 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.635272 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.655880 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:33Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.656121 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.658029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.658095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.658120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.658151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.658178 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.736761 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.736817 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.736782 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.737357 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.737554 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.737863 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.738146 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:33 crc kubenswrapper[4794]: E1215 13:55:33.738182 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.761337 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.761399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.761423 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.761454 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.761475 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.864710 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.864755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.864829 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.864853 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.864876 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.967537 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.967656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.967685 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.967754 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:33 crc kubenswrapper[4794]: I1215 13:55:33.967779 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:33Z","lastTransitionTime":"2025-12-15T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.070610 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.070669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.070689 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.070715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.070732 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.174286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.174337 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.174354 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.174378 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.174395 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.277894 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.277991 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.278009 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.278034 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.278055 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.381647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.381736 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.381758 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.381788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.381809 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.484966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.485029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.485046 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.485073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.485090 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.588490 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.588548 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.588566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.588619 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.588636 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.691026 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.691098 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.691117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.691141 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.691162 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.794408 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.794477 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.794494 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.794519 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.794540 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.897027 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.897093 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.897110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.897133 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.897151 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.999360 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.999412 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.999434 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.999459 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:34 crc kubenswrapper[4794]: I1215 13:55:34.999481 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:34Z","lastTransitionTime":"2025-12-15T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.102613 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.102681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.102700 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.102727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.102746 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.206166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.206244 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.206261 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.206286 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.206304 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.308487 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.308539 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.308550 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.308567 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.308601 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.410546 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.410620 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.410633 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.410648 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.410658 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.512918 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.512966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.512976 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.512993 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.513005 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.615219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.615260 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.615270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.615288 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.615319 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.718127 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.718204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.718221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.718244 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.718264 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.736385 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.736449 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.736457 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.736516 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:35 crc kubenswrapper[4794]: E1215 13:55:35.736691 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:35 crc kubenswrapper[4794]: E1215 13:55:35.736794 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:35 crc kubenswrapper[4794]: E1215 13:55:35.736913 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:35 crc kubenswrapper[4794]: E1215 13:55:35.737025 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.821707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.821787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.821811 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.821842 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.821865 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.924801 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.924850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.924860 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.924880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:35 crc kubenswrapper[4794]: I1215 13:55:35.924892 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:35Z","lastTransitionTime":"2025-12-15T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.028345 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.028399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.028412 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.028433 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.028447 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.131187 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.131240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.131256 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.131279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.131295 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.234290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.234391 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.234408 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.234429 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.234443 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.336392 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.336447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.336460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.336480 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.336494 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.438728 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.438768 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.438778 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.438794 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.438836 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.541544 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.541602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.541613 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.541630 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.541641 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.644661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.644736 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.644760 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.644788 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.644814 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.747019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.747104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.747136 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.747166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.747186 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.849904 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.849975 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.849997 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.850029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.850052 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.953091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.953134 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.953151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.953166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:36 crc kubenswrapper[4794]: I1215 13:55:36.953175 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:36Z","lastTransitionTime":"2025-12-15T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.055016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.055054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.055063 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.055095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.055104 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.158677 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.158837 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.158861 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.158885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.158937 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.261498 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.261651 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.261675 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.261715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.261739 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.364698 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.364777 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.364789 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.364808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.364823 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.467272 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.467334 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.467350 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.467372 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.467389 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.570270 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.570327 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.570338 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.570357 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.570369 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.672536 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.672717 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.672738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.672765 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.672797 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.736264 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.736234 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.736334 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.736365 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:37 crc kubenswrapper[4794]: E1215 13:55:37.736511 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:37 crc kubenswrapper[4794]: E1215 13:55:37.736716 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:37 crc kubenswrapper[4794]: E1215 13:55:37.736834 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:37 crc kubenswrapper[4794]: E1215 13:55:37.736921 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.775674 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.775741 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.775769 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.775799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.775821 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.869996 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:37 crc kubenswrapper[4794]: E1215 13:55:37.870218 4794 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:55:37 crc kubenswrapper[4794]: E1215 13:55:37.870360 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs podName:1f6d5d8e-1512-4d71-8363-ba6003bf10b6 nodeName:}" failed. No retries permitted until 2025-12-15 13:56:41.870316271 +0000 UTC m=+163.722338749 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs") pod "network-metrics-daemon-4xt6f" (UID: "1f6d5d8e-1512-4d71-8363-ba6003bf10b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.878933 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.878998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.879017 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.879353 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.879550 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.982482 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.982543 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.982560 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.982606 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:37 crc kubenswrapper[4794]: I1215 13:55:37.982624 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:37Z","lastTransitionTime":"2025-12-15T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.084949 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.085029 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.085054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.085087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.085110 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.188039 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.188095 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.188111 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.188145 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.188183 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.291113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.291189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.291211 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.291240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.291262 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.393839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.393911 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.393935 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.393968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.393995 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.496563 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.496656 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.496681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.496711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.496734 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.600480 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.600555 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.600608 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.600640 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.600662 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.704124 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.704175 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.704191 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.704216 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.704235 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.768886 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66e0f94-8995-48c8-b0dd-b9a94aaa9eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a682b7f02cbdb7ad5fabccb299baa31be6a0942fb10248ac3595e4798d3e1810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b1e9505ff1fd9a625c8ae3e46ed58d51879d19a588bd1a639ac66960654863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf61e2f0ed68373f32c905b5a350081a8761b20f411228c65e97c95334d1f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ebf2b1485a58b9f0a793b913a28b8ec77cd948921ede69683d2d8d4c92b6904\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e33a44d9f6498853ab4a8fd59d3bc6a9792986cb5c403a7c10d78d25391b5aa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0f757a06f4652e693a1d9f6990b68e4858ab7934a5ca04475e98d32a30c10da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f757a06f4652e693a1d9f6990b68e4858ab7934a5ca04475e98d32a30c10da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69ef63a985f295c4df2876df8c2b0476619c7b3ba98cc75730fc83fdc334ec2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69ef63a985f295c4df2876df8c2b0476619c7b3ba98cc75730fc83fdc334ec2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a7b8ff07a882051dbc0e0bfd320e7b28efeb622cd125ac94f6c866e958e1fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7b8ff07a882051dbc0e0bfd320e7b28efeb622cd125ac94f6c866e958e1fb25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-15T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.786540 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd4061a286f3988d7e282faf17de9230289e3ce6df848ceaaa77cf67adc219a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.799572 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-t9nm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bc89ecc-eb8e-4926-bbb7-14c90f449e00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:08Z\\\",\\\"message\\\":\\\"2025-12-15T13:54:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea\\\\n2025-12-15T13:54:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8e4089b0-1ca0-4447-8f08-f2ea0bda32ea to /host/opt/cni/bin/\\\\n2025-12-15T13:54:23Z [verbose] multus-daemon started\\\\n2025-12-15T13:54:23Z [verbose] Readiness Indicator file check\\\\n2025-12-15T13:55:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vtn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-t9nm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.806979 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.807008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.807019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.807035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.807045 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.811354 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08abf0e4-50ec-4ee1-a927-e6383b60ab38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eeed487f556a3d20dedbc594b788e93abb7a36106fd8fcb75376853293189a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70b6aca89c3fdcdf70a71ef078ae403e5c654f7a396b83d04b541fa46fd214e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xt52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4d4wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.824089 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.836965 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xvkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb4cf2c0-63d3-40ed-a0f9-a0c381371407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ec873567b0e1f24561d1c9d174a0975df4f2ac1948a330bf695dc20b527d36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sl4ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xvkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.849011 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3538082f-5d54-4676-a488-7a3df6b9a1f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0403a7ffa2cc8f429c4fe300c5ec59a56e315e505528d46807b3481d0e97d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxxcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fq2s6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.861127 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t42rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4xt6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc 
kubenswrapper[4794]: I1215 13:55:38.882940 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1fbcc318-495e-462d-9990-8688e8ca6584\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b178323c82e5
486229a606e7336ce70a047404c1be5703f11cd3818d14790e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.902610 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66e038c3-0b4b-498e-bf75-5021bea91086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13394865229d1a00590a3ca85a7efe4723333c0b17c9e8c17d7f0fae6538843a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9dc2112afcf2d2ddf701f8c729c8b4a1a01254f0bb216be7d4c4b3a7c3b71d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff06dcbc6e70e10668ab49fa588f72e35c3a5920c79d92b1ce42404d7a04b2bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.909017 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.909043 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.909053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.909068 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.909078 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:38Z","lastTransitionTime":"2025-12-15T13:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.918506 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"680bfb58-bda5-4da2-893e-2d4fdd2aca1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7728bcaededfd7e13b2789fb63ed0d27cf7eadf54d2d0748491b9582d1c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fa192bd16bcbeefb207dadde9c464c1e7a12d24bf7199e4e6c2bea08f5da40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2127a4d55b599e7b1b0b6306d51d67e458670c4606ade8aaade3e79784268c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a76f0f68b3884e7eb4fc5095f6643ee2a805495f8bc1a66c70724d84a4b76c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.938226 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.955844 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.971237 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33db367-8090-4973-a405-f5b4c8b8479a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0f4835c2ef70cf1c56fe030ac7d8142c83e5644b362985166f4dc731ea1567e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457e861e4da998c65b74681dd27e817d1c125d795e26655cb31d18d38d2b188b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4886baa5be0df5a1d0ffa815f01ab781a3860c9cbbce884919f7468ff1c7d783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc60c47d18a7b8c1fb57b471fcd3b2a0e457071c135c7310290617c3f7a5b452\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7563
635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7563635f1a603f65e32c33b1e5464bf6d37180fdb1dbc767b0a415d5531d718\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fee30805207fd6c87cfdcb1d29284d55515b38b168bd812427dc1443fd8b0c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://441c03df2c57c7d35cf7d669ba79b24a3fa8a61b04208b8ddb62b04eb7ed451c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-865qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cjbhj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.984051 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1252f7a3-5ea8-4df8-9070-31dcdea637c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:53:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d94910a84079baf96b67a0d8d90c33cb2d524a83951541f58c2b2c01aba5d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdfc9d44781f8332077ad11e45021de22068a657b0397e71094733c2b3ab418\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:53:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:38 crc kubenswrapper[4794]: I1215 13:55:38.997851 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e4bd62371444b77ad7a0e74e008eecfd91723492c7c6ce1f490de635519f198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:38Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.011110 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc9c6b860593b552fbfce014ac20b79d0bee7cbeabf6dcf4ccaf825909cf6e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://61e09cf480ea1d7fada089a5bb4f5ef42a31f731fda69f006400c0f784eb5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.011363 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.011380 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.011388 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.011402 4794 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.011413 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.030551 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"628fdda9-19ac-4a1d-a93b-82a10124a8ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-15T13:55:26Z\\\",\\\"message\\\":\\\"-config-operator/kube-rbac-proxy-crio-crc openshift-machine-config-operator/machine-config-daemon-fq2s6 openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI1215 13:55:26.561380 6963 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1215 13:55:26.561395 6963 obj_retry.go:303] Retry 
object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561407 6963 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561416 6963 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI1215 13:55:26.561421 6963 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1215 13:55:26.561427 6963 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1215 13:55:26.561439 6963 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1215 13:55:26.561452 6963 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1215 13:55:26.561501 6963 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-15T13:55:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f1ee99cafea3a945e
195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-15T13:54:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-15T13:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pxv8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cwnfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.040620 4794 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bpwnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abd16c69-cba3-49a5-adc8-92a14453db80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-15T13:54:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ab35e00910962fdad373a158518288049cc321fd57643ae27e168cb1fc2e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-15T13:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqk66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-15T13:54:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bpwnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:39Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.119821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.119880 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.119896 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.119919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.119935 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.222965 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.223077 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.223103 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.223131 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.223151 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.326188 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.326639 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.326666 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.326687 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.326792 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.429189 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.429558 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.429755 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.429907 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.430034 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.532962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.533079 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.533105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.533135 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.533158 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.635644 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.635704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.635716 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.635734 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.635745 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.737052 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.737152 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.737209 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.737512 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:39 crc kubenswrapper[4794]: E1215 13:55:39.737858 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:39 crc kubenswrapper[4794]: E1215 13:55:39.737965 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:39 crc kubenswrapper[4794]: E1215 13:55:39.738121 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:39 crc kubenswrapper[4794]: E1215 13:55:39.738289 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.738995 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.739080 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.739092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.739110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.739121 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.841801 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.841953 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.841985 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.842010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.842028 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.944995 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.945056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.945075 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.945099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:39 crc kubenswrapper[4794]: I1215 13:55:39.945116 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:39Z","lastTransitionTime":"2025-12-15T13:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.048970 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.049061 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.049085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.049362 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.049425 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.152384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.152457 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.152481 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.152510 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.152534 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.255635 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.255711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.255736 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.255766 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.255790 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.358888 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.358962 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.358984 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.359011 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.359029 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.463168 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.463230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.463249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.463275 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.463293 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.566258 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.566319 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.566336 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.566358 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.566375 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.668863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.668912 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.668992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.669018 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.669036 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.771647 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.771706 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.771723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.771753 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.771770 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.874847 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.874892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.874901 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.874916 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.874925 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.977366 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.977416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.977444 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.977461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:40 crc kubenswrapper[4794]: I1215 13:55:40.977473 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:40Z","lastTransitionTime":"2025-12-15T13:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.079974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.080074 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.080130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.080167 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.080187 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.182870 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.182969 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.182993 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.183023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.183045 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.286019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.286101 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.286125 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.286156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.286181 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.389460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.389512 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.389529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.389551 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.389568 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.492699 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.492764 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.492787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.492815 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.492837 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.596001 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.596115 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.596132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.596156 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.596173 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.699930 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.700037 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.700053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.700077 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.700096 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.736836 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.736931 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:41 crc kubenswrapper[4794]: E1215 13:55:41.737044 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.737092 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.737169 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:41 crc kubenswrapper[4794]: E1215 13:55:41.737353 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:41 crc kubenswrapper[4794]: E1215 13:55:41.737453 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:41 crc kubenswrapper[4794]: E1215 13:55:41.737548 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.738655 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 13:55:41 crc kubenswrapper[4794]: E1215 13:55:41.738903 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.804416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.804484 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.804496 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.804514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.804526 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.907974 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.908066 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.908104 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.908134 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:41 crc kubenswrapper[4794]: I1215 13:55:41.908156 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:41Z","lastTransitionTime":"2025-12-15T13:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.011121 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.011196 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.011219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.011249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.011271 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.154238 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.154313 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.154330 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.154355 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.154373 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.257722 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.257795 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.257814 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.257839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.257859 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.361035 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.361085 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.361102 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.361126 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.361145 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.463143 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.463183 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.463194 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.463209 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.463220 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.565732 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.565807 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.565833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.565863 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.565888 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.668045 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.668087 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.668100 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.668120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.668135 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.770382 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.770447 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.770465 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.770496 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.770514 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.874010 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.874094 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.874117 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.874149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.874171 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.977561 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.977664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.977686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.977715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:42 crc kubenswrapper[4794]: I1215 13:55:42.977738 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:42Z","lastTransitionTime":"2025-12-15T13:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.082127 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.082181 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.082205 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.082236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.082263 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.185978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.186032 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.186049 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.186072 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.186089 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.288789 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.288827 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.288839 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.288854 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.288868 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.391885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.391963 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.391988 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.392019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.392042 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.495446 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.495661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.495685 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.495708 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.495726 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.598664 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.598756 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.598773 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.598797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.598823 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.702064 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.702120 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.702138 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.702161 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.702177 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.736932 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.737020 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.737057 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.736931 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:43 crc kubenswrapper[4794]: E1215 13:55:43.737131 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:43 crc kubenswrapper[4794]: E1215 13:55:43.737310 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:43 crc kubenswrapper[4794]: E1215 13:55:43.737477 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:43 crc kubenswrapper[4794]: E1215 13:55:43.737648 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.805247 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.805308 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.805328 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.805352 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.805370 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.907805 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.907876 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.907887 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.907903 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:43 crc kubenswrapper[4794]: I1215 13:55:43.907914 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:43Z","lastTransitionTime":"2025-12-15T13:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.005234 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.005279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.005291 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.005307 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.005319 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: E1215 13:55:44.017934 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:44Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.024416 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.024472 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.024496 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.024516 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.024528 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: E1215 13:55:44.040030 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:44Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.044633 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.044732 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.044744 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.044787 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.044801 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: E1215 13:55:44.058148 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:44Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.061883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.061923 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.061936 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.061951 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.061962 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: E1215 13:55:44.076983 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:44Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.082047 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.082112 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.082128 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.082151 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.082169 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: E1215 13:55:44.098252 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-15T13:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"134a30f2-e02c-4026-a8fd-915d12b3ae90\\\",\\\"systemUUID\\\":\\\"2e6b0193-6ba1-4635-a26c-e50e20b7171c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-15T13:55:44Z is after 2025-08-24T17:21:41Z" Dec 15 13:55:44 crc kubenswrapper[4794]: E1215 13:55:44.098657 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.100844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.100898 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.100917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.100941 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.100959 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.203445 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.203491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.203503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.203521 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.203533 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.306718 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.306765 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.306777 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.306793 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.306804 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.409152 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.409204 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.409215 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.409233 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.409246 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.512186 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.512236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.512252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.512275 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.512301 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.614786 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.614824 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.614834 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.614850 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.614861 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.718162 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.718230 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.718249 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.718280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.718299 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.820332 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.820389 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.820405 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.820430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.820451 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.924679 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.924738 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.924758 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.924782 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:44 crc kubenswrapper[4794]: I1215 13:55:44.924801 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:44Z","lastTransitionTime":"2025-12-15T13:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.028024 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.028059 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.028071 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.028088 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.028102 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.130224 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.130298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.130324 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.130352 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.130374 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.233140 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.233182 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.233193 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.233209 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.233220 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.336854 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.336926 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.336949 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.336980 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.337003 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.439701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.439768 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.439801 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.439832 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.439854 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.543147 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.543213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.543236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.543267 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.543290 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.645402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.645471 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.645489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.645517 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.645536 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.736497 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.736532 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.736613 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.736648 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:45 crc kubenswrapper[4794]: E1215 13:55:45.736835 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:45 crc kubenswrapper[4794]: E1215 13:55:45.737047 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:45 crc kubenswrapper[4794]: E1215 13:55:45.737141 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:45 crc kubenswrapper[4794]: E1215 13:55:45.737237 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.747786 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.747820 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.747831 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.747851 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.747862 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.850753 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.850826 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.850848 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.850875 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.850898 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.959940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.960003 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.960021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.960044 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:45 crc kubenswrapper[4794]: I1215 13:55:45.960062 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:45Z","lastTransitionTime":"2025-12-15T13:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.063043 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.063402 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.063602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.063766 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.063918 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.167493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.167568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.167701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.167736 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.167757 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.270491 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.270751 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.270762 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.270779 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.270794 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.374056 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.374169 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.374195 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.374227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.374249 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.476791 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.476845 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.476859 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.476877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.476891 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.580213 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.580272 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.580290 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.580315 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.580333 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.682684 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.682765 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.682782 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.682808 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.682829 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.786119 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.786210 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.786228 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.786810 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.786857 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.890357 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.890435 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.890460 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.890483 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.890501 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.994016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.994105 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.994132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.994164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:46 crc kubenswrapper[4794]: I1215 13:55:46.994257 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:46Z","lastTransitionTime":"2025-12-15T13:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.098390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.098486 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.098504 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.098528 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.098545 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.201462 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.201556 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.201623 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.201652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.201668 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.304846 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.304954 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.304978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.305008 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.305028 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.407914 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.407978 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.407997 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.408019 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.408037 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.510977 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.511040 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.511053 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.511073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.511086 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.614263 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.614377 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.614395 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.614418 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.614434 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.717219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.717279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.717298 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.717324 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.717342 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.736761 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.736838 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.736771 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.737028 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:47 crc kubenswrapper[4794]: E1215 13:55:47.737160 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:47 crc kubenswrapper[4794]: E1215 13:55:47.737316 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:47 crc kubenswrapper[4794]: E1215 13:55:47.737463 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:47 crc kubenswrapper[4794]: E1215 13:55:47.737673 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.820828 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.820883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.820892 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.820906 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.820936 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.923408 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.923470 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.923489 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.923513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:47 crc kubenswrapper[4794]: I1215 13:55:47.923530 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:47Z","lastTransitionTime":"2025-12-15T13:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.026150 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.026187 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.026196 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.026209 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.026219 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.129917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.129981 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.129998 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.130023 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.130040 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.233273 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.233345 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.233364 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.233390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.233411 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.336566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.336661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.336681 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.336715 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.336736 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.440132 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.440201 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.440219 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.440244 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.440262 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.543139 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.543227 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.543250 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.543280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.543297 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.646040 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.646113 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.646137 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.646172 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.646197 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.748958 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.749049 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.749073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.749101 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.749124 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.833359 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bpwnn" podStartSLOduration=91.833336077 podStartE2EDuration="1m31.833336077s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.83237817 +0000 UTC m=+110.684400678" watchObservedRunningTime="2025-12-15 13:55:48.833336077 +0000 UTC m=+110.685358555" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.848195 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=43.848170357 podStartE2EDuration="43.848170357s" podCreationTimestamp="2025-12-15 13:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.847309654 +0000 UTC m=+110.699332102" watchObservedRunningTime="2025-12-15 13:55:48.848170357 +0000 UTC m=+110.700192835" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.852016 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.852046 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.852054 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.852067 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.852075 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.901064 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t9nm7" podStartSLOduration=90.901036711 podStartE2EDuration="1m30.901036711s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.888563186 +0000 UTC m=+110.740585634" watchObservedRunningTime="2025-12-15 13:55:48.901036711 +0000 UTC m=+110.753059189" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.901495 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4d4wn" podStartSLOduration=89.901487174 podStartE2EDuration="1m29.901487174s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.900386863 +0000 UTC m=+110.752409311" watchObservedRunningTime="2025-12-15 13:55:48.901487174 +0000 UTC m=+110.753509642" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.924625 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=18.924610444 podStartE2EDuration="18.924610444s" podCreationTimestamp="2025-12-15 13:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.923760171 +0000 UTC 
m=+110.775782639" watchObservedRunningTime="2025-12-15 13:55:48.924610444 +0000 UTC m=+110.776632882" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.943506 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podStartSLOduration=90.943483327 podStartE2EDuration="1m30.943483327s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.943137917 +0000 UTC m=+110.795160365" watchObservedRunningTime="2025-12-15 13:55:48.943483327 +0000 UTC m=+110.795505795" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.955060 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.955091 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.955099 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.955116 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.955125 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:48Z","lastTransitionTime":"2025-12-15T13:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:48 crc kubenswrapper[4794]: I1215 13:55:48.983986 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6xvkj" podStartSLOduration=91.983959688 podStartE2EDuration="1m31.983959688s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:48.982793475 +0000 UTC m=+110.834815923" watchObservedRunningTime="2025-12-15 13:55:48.983959688 +0000 UTC m=+110.835982166" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.015641 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.015620894 podStartE2EDuration="55.015620894s" podCreationTimestamp="2025-12-15 13:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:49.001271817 +0000 UTC m=+110.853294325" watchObservedRunningTime="2025-12-15 13:55:49.015620894 +0000 UTC m=+110.867643342" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.057919 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.057968 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.057984 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.058005 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.058020 4794 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.064147 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cjbhj" podStartSLOduration=90.064124808 podStartE2EDuration="1m30.064124808s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:49.06386317 +0000 UTC m=+110.915885668" watchObservedRunningTime="2025-12-15 13:55:49.064124808 +0000 UTC m=+110.916147266" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.098548 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=94.09851406 podStartE2EDuration="1m34.09851406s" podCreationTimestamp="2025-12-15 13:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:49.081821738 +0000 UTC m=+110.933844216" watchObservedRunningTime="2025-12-15 13:55:49.09851406 +0000 UTC m=+110.950536538" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.160947 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.161021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.161045 4794 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.161076 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.161099 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.264513 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.264568 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.264621 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.264651 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.264668 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.367448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.367897 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.368030 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.368197 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.368337 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.471427 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.471539 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.471566 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.471645 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.471667 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.574461 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.574514 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.574531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.574553 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.574571 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.677236 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.677292 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.677310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.677332 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.677348 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.736292 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.736388 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.736415 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.736495 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:49 crc kubenswrapper[4794]: E1215 13:55:49.737132 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:49 crc kubenswrapper[4794]: E1215 13:55:49.737324 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:49 crc kubenswrapper[4794]: E1215 13:55:49.737424 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:49 crc kubenswrapper[4794]: E1215 13:55:49.737519 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.779529 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.779631 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.779658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.779686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.779706 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.883241 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.883367 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.883384 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.883409 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.883425 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.986430 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.986476 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.986493 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.986515 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:49 crc kubenswrapper[4794]: I1215 13:55:49.986532 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:49Z","lastTransitionTime":"2025-12-15T13:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.089137 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.089198 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.089215 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.089240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.089260 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.192404 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.192458 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.192478 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.192503 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.192519 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.295324 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.295399 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.295423 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.295452 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.295473 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.398007 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.398071 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.398090 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.398110 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.398124 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.501643 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.501723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.501749 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.501780 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.501804 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.604386 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.604479 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.604501 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.604533 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.604554 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.707885 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.707940 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.707952 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.707966 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.707978 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.809662 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.809716 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.809737 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.809762 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.809781 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.912727 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.912784 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.912799 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.912822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:50 crc kubenswrapper[4794]: I1215 13:55:50.912838 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:50Z","lastTransitionTime":"2025-12-15T13:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.016073 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.016130 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.016143 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.016164 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.016180 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.119149 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.119226 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.119252 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.119283 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.119307 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.222653 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.222720 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.222742 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.222774 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.222795 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.326390 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.326558 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.326571 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.326602 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.326612 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.429909 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.429997 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.430021 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.430050 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.430067 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.533713 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.533781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.533797 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.533822 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.533839 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.637090 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.637165 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.637190 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.637221 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.637243 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.736143 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.736186 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.736186 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.736310 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:51 crc kubenswrapper[4794]: E1215 13:55:51.736490 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:51 crc kubenswrapper[4794]: E1215 13:55:51.736631 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:51 crc kubenswrapper[4794]: E1215 13:55:51.736795 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:51 crc kubenswrapper[4794]: E1215 13:55:51.736917 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.740358 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.740406 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.740425 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.740448 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.740464 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.843633 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.843701 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.843723 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.843906 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.843935 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.945833 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.945872 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.945883 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.945899 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:51 crc kubenswrapper[4794]: I1215 13:55:51.945912 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:51Z","lastTransitionTime":"2025-12-15T13:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.048704 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.048760 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.048778 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.048800 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.048817 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.151781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.151844 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.151861 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.151884 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.151902 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.254614 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.254669 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.254686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.254711 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.254729 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.358246 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.358325 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.358344 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.358373 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.358393 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.462069 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.462137 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.462179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.462212 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.462236 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.564531 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.564652 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.564676 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.564707 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.564729 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.667802 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.667877 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.667917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.667955 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.667982 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.770661 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.770760 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.770781 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.770803 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.770821 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.873748 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.873804 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.873821 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.873841 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.873856 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.978571 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.978696 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.978716 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.978745 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:52 crc kubenswrapper[4794]: I1215 13:55:52.978764 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:52Z","lastTransitionTime":"2025-12-15T13:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.081946 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.082020 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.082040 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.082065 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.082082 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.185500 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.185641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.185670 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.185706 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.185726 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.288548 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.288637 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.288660 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.288686 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.288706 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.391155 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.391223 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.391274 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.391309 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.391373 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.494917 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.494992 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.495013 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.495039 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.495057 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.598012 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.598070 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.598086 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.598107 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.598121 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.701222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.701279 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.701321 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.701345 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.701361 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.737011 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.737108 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.737213 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.737287 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:53 crc kubenswrapper[4794]: E1215 13:55:53.737298 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:53 crc kubenswrapper[4794]: E1215 13:55:53.737386 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:53 crc kubenswrapper[4794]: E1215 13:55:53.737464 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:53 crc kubenswrapper[4794]: E1215 13:55:53.737903 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.738046 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 13:55:53 crc kubenswrapper[4794]: E1215 13:55:53.738179 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.804163 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.804299 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.804333 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.804368 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.804389 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.907166 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.907222 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.907240 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.907262 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:53 crc kubenswrapper[4794]: I1215 13:55:53.907279 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:53Z","lastTransitionTime":"2025-12-15T13:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.010291 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.010345 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.010365 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.010387 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.010407 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:54Z","lastTransitionTime":"2025-12-15T13:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.114542 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.114627 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.114641 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.114658 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.114672 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:54Z","lastTransitionTime":"2025-12-15T13:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.135179 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.135254 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.135280 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.135310 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.135334 4794 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-15T13:55:54Z","lastTransitionTime":"2025-12-15T13:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.201555 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=96.201536581 podStartE2EDuration="1m36.201536581s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:49.099373184 +0000 UTC m=+110.951395652" watchObservedRunningTime="2025-12-15 13:55:54.201536581 +0000 UTC m=+116.053559019" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.202764 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv"] Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.203155 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.208943 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.209528 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.209621 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.209833 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.293280 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/85fc395b-0007-4b9f-82f1-48981b2731ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.293326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85fc395b-0007-4b9f-82f1-48981b2731ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.293392 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85fc395b-0007-4b9f-82f1-48981b2731ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.293415 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85fc395b-0007-4b9f-82f1-48981b2731ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.293453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85fc395b-0007-4b9f-82f1-48981b2731ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.394930 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85fc395b-0007-4b9f-82f1-48981b2731ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.395024 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85fc395b-0007-4b9f-82f1-48981b2731ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.395121 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85fc395b-0007-4b9f-82f1-48981b2731ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.395138 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85fc395b-0007-4b9f-82f1-48981b2731ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.395190 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/85fc395b-0007-4b9f-82f1-48981b2731ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.395240 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85fc395b-0007-4b9f-82f1-48981b2731ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.395254 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85fc395b-0007-4b9f-82f1-48981b2731ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.396856 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85fc395b-0007-4b9f-82f1-48981b2731ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.413728 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85fc395b-0007-4b9f-82f1-48981b2731ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.428782 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85fc395b-0007-4b9f-82f1-48981b2731ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6swjv\" (UID: \"85fc395b-0007-4b9f-82f1-48981b2731ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:54 crc kubenswrapper[4794]: I1215 13:55:54.520621 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.390870 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/1.log" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.391522 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/0.log" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.391573 4794 generic.go:334] "Generic (PLEG): container finished" podID="0bc89ecc-eb8e-4926-bbb7-14c90f449e00" containerID="38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171" exitCode=1 Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.391675 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerDied","Data":"38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171"} Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.391713 4794 scope.go:117] "RemoveContainer" containerID="babf355cdc42b1b3ab521f8748b6e2d2f3df7794f08f211364f9af0e29a6ce95" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.393081 4794 scope.go:117] "RemoveContainer" containerID="38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171" Dec 15 13:55:55 crc kubenswrapper[4794]: E1215 13:55:55.393448 4794 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-t9nm7_openshift-multus(0bc89ecc-eb8e-4926-bbb7-14c90f449e00)\"" pod="openshift-multus/multus-t9nm7" podUID="0bc89ecc-eb8e-4926-bbb7-14c90f449e00" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.402127 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" event={"ID":"85fc395b-0007-4b9f-82f1-48981b2731ac","Type":"ContainerStarted","Data":"b8215ff9dcd2a4619240c2b913caebffb9a0dafed2b5b601278ff15ecac134b3"} Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.402201 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" event={"ID":"85fc395b-0007-4b9f-82f1-48981b2731ac","Type":"ContainerStarted","Data":"df619ae3eae8706001e06f9a8e19d8130ef47734d1506e35e5c6b19671df0ac8"} Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.437908 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6swjv" podStartSLOduration=97.437885666 podStartE2EDuration="1m37.437885666s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:55:55.435048568 +0000 UTC m=+117.287071016" watchObservedRunningTime="2025-12-15 13:55:55.437885666 +0000 UTC m=+117.289908104" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.737094 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.737117 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.737141 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:55 crc kubenswrapper[4794]: I1215 13:55:55.737091 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:55 crc kubenswrapper[4794]: E1215 13:55:55.737263 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:55 crc kubenswrapper[4794]: E1215 13:55:55.737490 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:55 crc kubenswrapper[4794]: E1215 13:55:55.737541 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:55 crc kubenswrapper[4794]: E1215 13:55:55.738023 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:56 crc kubenswrapper[4794]: I1215 13:55:56.408700 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/1.log" Dec 15 13:55:57 crc kubenswrapper[4794]: I1215 13:55:57.736277 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:57 crc kubenswrapper[4794]: I1215 13:55:57.736328 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:57 crc kubenswrapper[4794]: I1215 13:55:57.736352 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:57 crc kubenswrapper[4794]: I1215 13:55:57.736347 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:57 crc kubenswrapper[4794]: E1215 13:55:57.736475 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:55:57 crc kubenswrapper[4794]: E1215 13:55:57.736841 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:57 crc kubenswrapper[4794]: E1215 13:55:57.737083 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:57 crc kubenswrapper[4794]: E1215 13:55:57.737322 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:58 crc kubenswrapper[4794]: E1215 13:55:58.741918 4794 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 15 13:55:58 crc kubenswrapper[4794]: E1215 13:55:58.863976 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 15 13:55:59 crc kubenswrapper[4794]: I1215 13:55:59.736731 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:55:59 crc kubenswrapper[4794]: I1215 13:55:59.736810 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:55:59 crc kubenswrapper[4794]: E1215 13:55:59.736920 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:55:59 crc kubenswrapper[4794]: I1215 13:55:59.736810 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:55:59 crc kubenswrapper[4794]: E1215 13:55:59.737181 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:55:59 crc kubenswrapper[4794]: E1215 13:55:59.737254 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:55:59 crc kubenswrapper[4794]: I1215 13:55:59.738563 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:55:59 crc kubenswrapper[4794]: E1215 13:55:59.738938 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:01 crc kubenswrapper[4794]: I1215 13:56:01.736194 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:01 crc kubenswrapper[4794]: I1215 13:56:01.736228 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:01 crc kubenswrapper[4794]: E1215 13:56:01.736566 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:01 crc kubenswrapper[4794]: I1215 13:56:01.748990 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:01 crc kubenswrapper[4794]: E1215 13:56:01.738188 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:01 crc kubenswrapper[4794]: E1215 13:56:01.749198 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:01 crc kubenswrapper[4794]: I1215 13:56:01.749290 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:01 crc kubenswrapper[4794]: E1215 13:56:01.749958 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:03 crc kubenswrapper[4794]: I1215 13:56:03.747957 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:03 crc kubenswrapper[4794]: I1215 13:56:03.747960 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:03 crc kubenswrapper[4794]: I1215 13:56:03.748002 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:03 crc kubenswrapper[4794]: E1215 13:56:03.749240 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:03 crc kubenswrapper[4794]: E1215 13:56:03.749000 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:03 crc kubenswrapper[4794]: I1215 13:56:03.748031 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:03 crc kubenswrapper[4794]: E1215 13:56:03.749347 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:03 crc kubenswrapper[4794]: E1215 13:56:03.749650 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:03 crc kubenswrapper[4794]: E1215 13:56:03.865151 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 15 13:56:05 crc kubenswrapper[4794]: I1215 13:56:05.736311 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:05 crc kubenswrapper[4794]: I1215 13:56:05.736381 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:05 crc kubenswrapper[4794]: I1215 13:56:05.736302 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:05 crc kubenswrapper[4794]: E1215 13:56:05.736491 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:05 crc kubenswrapper[4794]: E1215 13:56:05.736680 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:05 crc kubenswrapper[4794]: I1215 13:56:05.736321 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:05 crc kubenswrapper[4794]: E1215 13:56:05.736748 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:05 crc kubenswrapper[4794]: E1215 13:56:05.736807 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:06 crc kubenswrapper[4794]: I1215 13:56:06.737165 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 13:56:06 crc kubenswrapper[4794]: E1215 13:56:06.737611 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cwnfl_openshift-ovn-kubernetes(628fdda9-19ac-4a1d-a93b-82a10124a8ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" Dec 15 13:56:07 crc kubenswrapper[4794]: I1215 13:56:07.736925 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:07 crc kubenswrapper[4794]: I1215 13:56:07.736980 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:07 crc kubenswrapper[4794]: E1215 13:56:07.737108 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:07 crc kubenswrapper[4794]: I1215 13:56:07.737176 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:07 crc kubenswrapper[4794]: I1215 13:56:07.737182 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:07 crc kubenswrapper[4794]: E1215 13:56:07.737279 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:07 crc kubenswrapper[4794]: E1215 13:56:07.737433 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:07 crc kubenswrapper[4794]: E1215 13:56:07.737815 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:07 crc kubenswrapper[4794]: I1215 13:56:07.737968 4794 scope.go:117] "RemoveContainer" containerID="38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171" Dec 15 13:56:08 crc kubenswrapper[4794]: I1215 13:56:08.455619 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/1.log" Dec 15 13:56:08 crc kubenswrapper[4794]: I1215 13:56:08.455690 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerStarted","Data":"76c2abef58cd747a318832e9c9a8f53a0a66e4077a34e21078a3dc23196cbc26"} Dec 15 13:56:08 crc kubenswrapper[4794]: E1215 13:56:08.865777 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 15 13:56:09 crc kubenswrapper[4794]: I1215 13:56:09.737154 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:09 crc kubenswrapper[4794]: I1215 13:56:09.737285 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:09 crc kubenswrapper[4794]: I1215 13:56:09.737217 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:09 crc kubenswrapper[4794]: I1215 13:56:09.737176 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:09 crc kubenswrapper[4794]: E1215 13:56:09.737393 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:09 crc kubenswrapper[4794]: E1215 13:56:09.737687 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:09 crc kubenswrapper[4794]: E1215 13:56:09.737769 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:09 crc kubenswrapper[4794]: E1215 13:56:09.737885 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:11 crc kubenswrapper[4794]: I1215 13:56:11.736649 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:11 crc kubenswrapper[4794]: I1215 13:56:11.736692 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:11 crc kubenswrapper[4794]: E1215 13:56:11.736818 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:11 crc kubenswrapper[4794]: I1215 13:56:11.736894 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:11 crc kubenswrapper[4794]: I1215 13:56:11.736900 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:11 crc kubenswrapper[4794]: E1215 13:56:11.737070 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:11 crc kubenswrapper[4794]: E1215 13:56:11.737213 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:11 crc kubenswrapper[4794]: E1215 13:56:11.737399 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:13 crc kubenswrapper[4794]: I1215 13:56:13.736229 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:13 crc kubenswrapper[4794]: I1215 13:56:13.736244 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:13 crc kubenswrapper[4794]: I1215 13:56:13.736292 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:13 crc kubenswrapper[4794]: E1215 13:56:13.737618 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:13 crc kubenswrapper[4794]: E1215 13:56:13.737147 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:13 crc kubenswrapper[4794]: E1215 13:56:13.737413 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:13 crc kubenswrapper[4794]: I1215 13:56:13.736289 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:13 crc kubenswrapper[4794]: E1215 13:56:13.737797 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:13 crc kubenswrapper[4794]: E1215 13:56:13.867115 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 15 13:56:15 crc kubenswrapper[4794]: I1215 13:56:15.736520 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:15 crc kubenswrapper[4794]: I1215 13:56:15.736563 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:15 crc kubenswrapper[4794]: I1215 13:56:15.736568 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:15 crc kubenswrapper[4794]: I1215 13:56:15.736526 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:15 crc kubenswrapper[4794]: E1215 13:56:15.736741 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:15 crc kubenswrapper[4794]: E1215 13:56:15.736868 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:15 crc kubenswrapper[4794]: E1215 13:56:15.737224 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:15 crc kubenswrapper[4794]: E1215 13:56:15.737077 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:17 crc kubenswrapper[4794]: I1215 13:56:17.736771 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:17 crc kubenswrapper[4794]: I1215 13:56:17.736817 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:17 crc kubenswrapper[4794]: E1215 13:56:17.736919 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:17 crc kubenswrapper[4794]: E1215 13:56:17.737027 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:17 crc kubenswrapper[4794]: I1215 13:56:17.736821 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:17 crc kubenswrapper[4794]: I1215 13:56:17.737224 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:17 crc kubenswrapper[4794]: E1215 13:56:17.737300 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:17 crc kubenswrapper[4794]: E1215 13:56:17.737325 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:17 crc kubenswrapper[4794]: I1215 13:56:17.738446 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 13:56:18 crc kubenswrapper[4794]: I1215 13:56:18.497508 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/3.log" Dec 15 13:56:18 crc kubenswrapper[4794]: I1215 13:56:18.500325 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerStarted","Data":"903c510f249c8b109fb6a118aad01f0358640936b98688be21968f0a1b3024ad"} Dec 15 13:56:18 crc kubenswrapper[4794]: I1215 13:56:18.501911 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:56:18 crc kubenswrapper[4794]: I1215 13:56:18.536194 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podStartSLOduration=119.536172519 podStartE2EDuration="1m59.536172519s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:18.533110387 +0000 UTC m=+140.385132855" watchObservedRunningTime="2025-12-15 13:56:18.536172519 +0000 UTC m=+140.388194997" Dec 15 13:56:18 crc kubenswrapper[4794]: I1215 13:56:18.706205 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4xt6f"] Dec 15 13:56:18 crc kubenswrapper[4794]: I1215 13:56:18.706293 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:18 crc kubenswrapper[4794]: E1215 13:56:18.706408 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:18 crc kubenswrapper[4794]: E1215 13:56:18.867734 4794 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 15 13:56:19 crc kubenswrapper[4794]: I1215 13:56:19.737025 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:19 crc kubenswrapper[4794]: I1215 13:56:19.737069 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:19 crc kubenswrapper[4794]: I1215 13:56:19.737432 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:19 crc kubenswrapper[4794]: E1215 13:56:19.737901 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:19 crc kubenswrapper[4794]: E1215 13:56:19.738031 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:19 crc kubenswrapper[4794]: E1215 13:56:19.738085 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:20 crc kubenswrapper[4794]: I1215 13:56:20.736892 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:20 crc kubenswrapper[4794]: E1215 13:56:20.737341 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:21 crc kubenswrapper[4794]: I1215 13:56:21.736720 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:21 crc kubenswrapper[4794]: I1215 13:56:21.736753 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:21 crc kubenswrapper[4794]: E1215 13:56:21.736833 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:21 crc kubenswrapper[4794]: I1215 13:56:21.736720 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:21 crc kubenswrapper[4794]: E1215 13:56:21.736905 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:21 crc kubenswrapper[4794]: E1215 13:56:21.736955 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:22 crc kubenswrapper[4794]: I1215 13:56:22.736889 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:22 crc kubenswrapper[4794]: E1215 13:56:22.737016 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xt6f" podUID="1f6d5d8e-1512-4d71-8363-ba6003bf10b6" Dec 15 13:56:23 crc kubenswrapper[4794]: I1215 13:56:23.736339 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:23 crc kubenswrapper[4794]: I1215 13:56:23.736421 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:23 crc kubenswrapper[4794]: I1215 13:56:23.736363 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:23 crc kubenswrapper[4794]: E1215 13:56:23.736522 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:56:23 crc kubenswrapper[4794]: E1215 13:56:23.736807 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:56:23 crc kubenswrapper[4794]: E1215 13:56:23.736917 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.534028 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.534107 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.625092 4794 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.682042 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.682947 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.685392 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqpgr"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.686335 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.693024 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qmz6h"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.693236 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.695496 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.695789 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.700557 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712079 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712261 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712315 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712275 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712518 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 
13:56:24.712661 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712662 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712746 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712521 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.712887 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.713183 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.713504 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2rt5j"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.713828 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.713860 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.713977 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.714232 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.714464 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.714720 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.719943 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8fxqs"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.728794 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.729370 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.731321 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.731695 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.731775 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.731776 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.731829 4794 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.731933 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732019 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732148 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732284 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732320 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732415 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732503 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732612 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732780 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.732973 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 
15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.733244 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.733416 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.734737 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-69q8p"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.735423 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.735869 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-824w4"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.736257 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.736371 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.736896 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqpgr"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.737025 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.737040 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.737104 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.737436 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.739778 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.741290 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.750106 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.750331 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.750479 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.751670 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.751913 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.752280 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 15 13:56:24 crc 
kubenswrapper[4794]: I1215 13:56:24.753033 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5m27d"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.757956 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.758620 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.758935 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.759291 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.759402 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.759484 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.760965 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.768899 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.769994 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770490 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770527 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-audit-policies\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770576 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0d34d344-13c1-4816-8286-2104852b248b-images\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770645 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770672 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa75386-467c-467e-8ef7-27c65cd6a2b5-serving-cert\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770704 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt2pn\" (UniqueName: \"kubernetes.io/projected/23e4e919-2929-484f-afe8-e80ddc566e7c-kube-api-access-vt2pn\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd8f05be-d624-4ca1-bd92-658fa0a768d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770749 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc 
kubenswrapper[4794]: I1215 13:56:24.770769 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqnk\" (UniqueName: \"kubernetes.io/projected/6fa75386-467c-467e-8ef7-27c65cd6a2b5-kube-api-access-bjqnk\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770789 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-image-import-ca\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770814 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-serving-cert\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770834 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8x65\" (UniqueName: \"kubernetes.io/projected/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-kube-api-access-b8x65\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770857 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk6lx\" (UniqueName: 
\"kubernetes.io/projected/0d34d344-13c1-4816-8286-2104852b248b-kube-api-access-bk6lx\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770877 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-etcd-client\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770889 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770974 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.771152 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.771357 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pdnn8"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.771725 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772136 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5m27d" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.770897 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/acf556c6-8a5a-4980-b07d-28939b2246ee-node-pullsecrets\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772450 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7fjt\" (UniqueName: \"kubernetes.io/projected/e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331-kube-api-access-x7fjt\") pod \"dns-operator-744455d44c-824w4\" (UID: \"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331\") " pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772477 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-serving-cert\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772507 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772528 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbmqh\" (UniqueName: \"kubernetes.io/projected/28dac958-62ff-4d38-9bf6-a86fa57fb772-kube-api-access-nbmqh\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772547 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d34d344-13c1-4816-8286-2104852b248b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772570 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-config\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bwb7\" (UniqueName: \"kubernetes.io/projected/057601fe-03b1-48d4-8bbc-4482f393d6cb-kube-api-access-5bwb7\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772650 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-dir\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772671 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34d344-13c1-4816-8286-2104852b248b-config\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772713 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772747 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772767 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772792 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331-metrics-tls\") pod \"dns-operator-744455d44c-824w4\" (UID: \"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331\") " pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772813 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jw4r\" (UniqueName: \"kubernetes.io/projected/acf556c6-8a5a-4980-b07d-28939b2246ee-kube-api-access-6jw4r\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772841 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm87p\" (UniqueName: \"kubernetes.io/projected/bd8f05be-d624-4ca1-bd92-658fa0a768d6-kube-api-access-nm87p\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772861 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-trusted-ca-bundle\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772908 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-audit\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772932 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-auth-proxy-config\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772961 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-config\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.772985 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-etcd-serving-ca\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773003 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-serving-cert\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-service-ca-bundle\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773052 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 
13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773073 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-config\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773092 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773112 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kl5b\" (UniqueName: \"kubernetes.io/projected/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-kube-api-access-2kl5b\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773136 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e4e919-2929-484f-afe8-e80ddc566e7c-serving-cert\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-encryption-config\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773187 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8f05be-d624-4ca1-bd92-658fa0a768d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773212 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-config\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773283 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-serving-cert\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/057601fe-03b1-48d4-8bbc-4482f393d6cb-audit-dir\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/acf556c6-8a5a-4980-b07d-28939b2246ee-audit-dir\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773345 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-machine-approver-tls\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773370 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-client-ca\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773393 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-etcd-client\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773412 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-config\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773433 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-client-ca\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773453 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-policies\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc 
kubenswrapper[4794]: I1215 13:56:24.773471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-encryption-config\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773491 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773510 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87477\" (UniqueName: \"kubernetes.io/projected/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-kube-api-access-87477\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.773628 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.774204 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.774721 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.776952 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.777000 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.789678 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rfx4t"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.791088 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.792683 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.793244 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.793734 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.795102 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.796836 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.797330 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.799020 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.800097 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q5mq8"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.801036 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.808191 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.808559 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.809170 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.810621 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813194 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813244 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813321 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813391 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813422 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813432 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813529 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813531 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.813571 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814310 
4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814519 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814732 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814847 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814885 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814889 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.814987 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815016 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815059 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815136 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815291 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815345 4794 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815291 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815382 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.815517 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.816639 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.816709 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.816778 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.816981 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.817151 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.817209 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.817243 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 15 13:56:24 crc 
kubenswrapper[4794]: I1215 13:56:24.817335 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.817771 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.817886 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.819543 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wd9tc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.819898 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.821015 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.821313 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.821422 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.821502 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.824600 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.824725 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8fxqs"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.827005 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.828493 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.829795 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.830316 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.830609 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.831225 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.832116 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.832703 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.833282 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qmz6h"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.834157 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.834257 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.835741 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-shzgd"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.836427 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.836807 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.837223 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.839841 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.840642 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.846218 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.846476 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.847246 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.848647 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.849064 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.849204 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.851907 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2rt5j"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.854836 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.856266 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.859634 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-824w4"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.860147 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.862661 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.863725 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.867844 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874342 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-config\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874368 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874384 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kl5b\" (UniqueName: \"kubernetes.io/projected/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-kube-api-access-2kl5b\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: 
\"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874399 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e4e919-2929-484f-afe8-e80ddc566e7c-serving-cert\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874414 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-encryption-config\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8f05be-d624-4ca1-bd92-658fa0a768d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874452 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-config\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874497 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874512 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874527 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-serving-cert\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874541 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/057601fe-03b1-48d4-8bbc-4482f393d6cb-audit-dir\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874557 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/acf556c6-8a5a-4980-b07d-28939b2246ee-audit-dir\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-machine-approver-tls\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874603 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-client-ca\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874621 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-etcd-client\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874637 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-config\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-policies\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874669 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-client-ca\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874686 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874702 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-encryption-config\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87477\" (UniqueName: \"kubernetes.io/projected/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-kube-api-access-87477\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874732 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: 
\"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874746 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874760 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-audit-policies\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874776 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0d34d344-13c1-4816-8286-2104852b248b-images\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874791 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874807 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6fa75386-467c-467e-8ef7-27c65cd6a2b5-serving-cert\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874824 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874839 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt2pn\" (UniqueName: \"kubernetes.io/projected/23e4e919-2929-484f-afe8-e80ddc566e7c-kube-api-access-vt2pn\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874855 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd8f05be-d624-4ca1-bd92-658fa0a768d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874872 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqnk\" (UniqueName: \"kubernetes.io/projected/6fa75386-467c-467e-8ef7-27c65cd6a2b5-kube-api-access-bjqnk\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874888 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-image-import-ca\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874906 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8x65\" (UniqueName: \"kubernetes.io/projected/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-kube-api-access-b8x65\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874924 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-serving-cert\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874942 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-etcd-client\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874962 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk6lx\" (UniqueName: \"kubernetes.io/projected/0d34d344-13c1-4816-8286-2104852b248b-kube-api-access-bk6lx\") pod 
\"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.874985 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdf86acc-2500-434f-963d-2aaa9219dfa1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875011 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/acf556c6-8a5a-4980-b07d-28939b2246ee-node-pullsecrets\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875032 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875054 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdf86acc-2500-434f-963d-2aaa9219dfa1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875079 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-serving-cert\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875125 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7fjt\" (UniqueName: \"kubernetes.io/projected/e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331-kube-api-access-x7fjt\") pod \"dns-operator-744455d44c-824w4\" (UID: \"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331\") " pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d34d344-13c1-4816-8286-2104852b248b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875175 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875199 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbmqh\" (UniqueName: \"kubernetes.io/projected/28dac958-62ff-4d38-9bf6-a86fa57fb772-kube-api-access-nbmqh\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: 
\"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875222 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-config\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875239 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bwb7\" (UniqueName: \"kubernetes.io/projected/057601fe-03b1-48d4-8bbc-4482f393d6cb-kube-api-access-5bwb7\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875255 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-dir\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875271 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34d344-13c1-4816-8286-2104852b248b-config\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875286 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875303 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875328 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875348 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875369 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jw4r\" (UniqueName: \"kubernetes.io/projected/acf556c6-8a5a-4980-b07d-28939b2246ee-kube-api-access-6jw4r\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc 
kubenswrapper[4794]: I1215 13:56:24.875390 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331-metrics-tls\") pod \"dns-operator-744455d44c-824w4\" (UID: \"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331\") " pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875414 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dvg\" (UniqueName: \"kubernetes.io/projected/fdf86acc-2500-434f-963d-2aaa9219dfa1-kube-api-access-z7dvg\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm87p\" (UniqueName: \"kubernetes.io/projected/bd8f05be-d624-4ca1-bd92-658fa0a768d6-kube-api-access-nm87p\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875463 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-trusted-ca-bundle\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875481 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875496 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-audit\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875513 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-auth-proxy-config\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875528 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-serving-cert\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-config\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875638 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-etcd-serving-ca\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875725 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-service-ca-bundle\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.875902 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-client-ca\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.876475 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-service-ca-bundle\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.876998 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc 
kubenswrapper[4794]: I1215 13:56:24.877303 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/057601fe-03b1-48d4-8bbc-4482f393d6cb-audit-dir\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.877566 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-client-ca\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.877707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.880727 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/acf556c6-8a5a-4980-b07d-28939b2246ee-audit-dir\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.880953 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 
13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.881719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-encryption-config\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.881838 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-serving-cert\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.882327 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-config\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.882331 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-encryption-config\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.883334 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8f05be-d624-4ca1-bd92-658fa0a768d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 
crc kubenswrapper[4794]: I1215 13:56:24.883458 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-config\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.883758 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.883817 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kpl54"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.884259 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-policies\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.884504 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.884987 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d34d344-13c1-4816-8286-2104852b248b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.885048 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/acf556c6-8a5a-4980-b07d-28939b2246ee-node-pullsecrets\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.884978 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-config\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.885516 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.886542 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e4e919-2929-484f-afe8-e80ddc566e7c-serving-cert\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: 
\"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.886689 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.886891 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.887157 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.887852 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34d344-13c1-4816-8286-2104852b248b-config\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.888382 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-config\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.888532 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-dir\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.888569 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-serving-cert\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.888806 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.888809 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-machine-approver-tls\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.889023 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.889306 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.890104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.890547 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0d34d344-13c1-4816-8286-2104852b248b-images\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.890962 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-auth-proxy-config\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.891049 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/057601fe-03b1-48d4-8bbc-4482f393d6cb-audit-policies\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.891408 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331-metrics-tls\") pod \"dns-operator-744455d44c-824w4\" (UID: \"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331\") " pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.892221 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-audit\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.892619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-serving-cert\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.892864 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.893534 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.894924 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.894974 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.895001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa75386-467c-467e-8ef7-27c65cd6a2b5-serving-cert\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.896136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-image-import-ca\") pod \"apiserver-76f77b778f-69q8p\" (UID: 
\"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.896389 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-etcd-serving-ca\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.896897 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-config\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.897244 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf556c6-8a5a-4980-b07d-28939b2246ee-trusted-ca-bundle\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.899055 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-etcd-client\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.906438 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.908973 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf556c6-8a5a-4980-b07d-28939b2246ee-serving-cert\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.910024 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/057601fe-03b1-48d4-8bbc-4482f393d6cb-etcd-client\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.913399 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.915170 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.917250 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd8f05be-d624-4ca1-bd92-658fa0a768d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.925222 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929256 4794 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rfx4t"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929442 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929498 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-69q8p"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929522 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5m27d"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929728 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929745 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.929755 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-shzgd"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.930716 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.931276 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.931646 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.932568 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvkw5"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.933226 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.933558 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.933952 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.934690 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vvw2l"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.935399 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.935416 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.937304 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pdnn8"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.937655 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.938772 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.939553 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.941469 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.942610 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.943153 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.943689 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.944465 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.944524 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.944980 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.945645 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q5mq8"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.946598 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.947063 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.947710 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.948658 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.950246 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.951221 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.952275 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.953184 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvkw5"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.954133 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.955097 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.956014 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.957003 4794 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kpl54"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.960301 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.962839 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.964119 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n7hgc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.966117 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.966652 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pdzrx"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.968036 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pdzrx" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.969789 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.971401 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.972374 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n7hgc"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.973177 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.974886 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.976042 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vvw2l"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.976285 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdf86acc-2500-434f-963d-2aaa9219dfa1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.976349 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dvg\" (UniqueName: \"kubernetes.io/projected/fdf86acc-2500-434f-963d-2aaa9219dfa1-kube-api-access-z7dvg\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: 
\"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.976449 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdf86acc-2500-434f-963d-2aaa9219dfa1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.977033 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdf86acc-2500-434f-963d-2aaa9219dfa1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.977157 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pdzrx"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.978403 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jpcm8"] Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.979104 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jpcm8" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.982312 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdf86acc-2500-434f-963d-2aaa9219dfa1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:24 crc kubenswrapper[4794]: I1215 13:56:24.985777 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.005081 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.024719 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.045203 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.069817 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.087416 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.104703 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.125156 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.145214 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.165141 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.185149 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.204935 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.246261 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.265255 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.286191 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.306130 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.325690 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.346336 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.366023 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.386275 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.405770 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.426157 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.445886 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.465854 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.486302 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.506486 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.525433 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.545686 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 15 13:56:25 crc kubenswrapper[4794]: 
I1215 13:56:25.565697 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.585433 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.607692 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.626923 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.646444 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.665620 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.685956 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.705822 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.724938 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.736291 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.736341 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.736396 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.745872 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.765518 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.786364 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.796430 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:25 crc kubenswrapper[4794]: E1215 13:56:25.796630 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:58:27.796557062 +0000 UTC m=+269.648579550 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.796699 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.796884 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.796941 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.797071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.805073 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.845157 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.863677 4794 request.go:700] Waited for 1.014097016s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-dockercfg-k9rxt&limit=500&resourceVersion=0 Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.865116 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.885855 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.905777 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.926160 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 15 13:56:25 crc kubenswrapper[4794]: I1215 13:56:25.945633 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 15 13:56:25 crc 
kubenswrapper[4794]: I1215 13:56:25.985692 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kl5b\" (UniqueName: \"kubernetes.io/projected/6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4-kube-api-access-2kl5b\") pod \"openshift-config-operator-7777fb866f-bpkkk\" (UID: \"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.006851 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.014303 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87477\" (UniqueName: \"kubernetes.io/projected/481f059a-4b43-4cbe-989b-bec3c0b9b9fb-kube-api-access-87477\") pod \"authentication-operator-69f744f599-8fxqs\" (UID: \"481f059a-4b43-4cbe-989b-bec3c0b9b9fb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.025894 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.046040 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.050992 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.055722 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.091002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jw4r\" (UniqueName: \"kubernetes.io/projected/acf556c6-8a5a-4980-b07d-28939b2246ee-kube-api-access-6jw4r\") pod \"apiserver-76f77b778f-69q8p\" (UID: \"acf556c6-8a5a-4980-b07d-28939b2246ee\") " pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.113187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8x65\" (UniqueName: \"kubernetes.io/projected/500b8fbe-cdab-46ef-8c4a-bda78ba3ed37-kube-api-access-b8x65\") pod \"machine-approver-56656f9798-hsk8w\" (UID: \"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.132359 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.138301 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbmqh\" (UniqueName: \"kubernetes.io/projected/28dac958-62ff-4d38-9bf6-a86fa57fb772-kube-api-access-nbmqh\") pod \"oauth-openshift-558db77b4-2rt5j\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.144944 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bwb7\" (UniqueName: \"kubernetes.io/projected/057601fe-03b1-48d4-8bbc-4482f393d6cb-kube-api-access-5bwb7\") pod \"apiserver-7bbb656c7d-fnkmj\" (UID: \"057601fe-03b1-48d4-8bbc-4482f393d6cb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:26 crc kubenswrapper[4794]: W1215 13:56:26.159086 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500b8fbe_cdab_46ef_8c4a_bda78ba3ed37.slice/crio-52c4469c94624b36ae651878dfac7bd6775c0cf985c76354423d67bc6a5abd28 WatchSource:0}: Error finding container 52c4469c94624b36ae651878dfac7bd6775c0cf985c76354423d67bc6a5abd28: Status 404 returned error can't find the container with id 52c4469c94624b36ae651878dfac7bd6775c0cf985c76354423d67bc6a5abd28 Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.165207 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7fjt\" (UniqueName: \"kubernetes.io/projected/e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331-kube-api-access-x7fjt\") pod \"dns-operator-744455d44c-824w4\" (UID: \"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331\") " pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.186730 4794 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bk6lx\" (UniqueName: \"kubernetes.io/projected/0d34d344-13c1-4816-8286-2104852b248b-kube-api-access-bk6lx\") pod \"machine-api-operator-5694c8668f-qmz6h\" (UID: \"0d34d344-13c1-4816-8286-2104852b248b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.221350 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm87p\" (UniqueName: \"kubernetes.io/projected/bd8f05be-d624-4ca1-bd92-658fa0a768d6-kube-api-access-nm87p\") pod \"openshift-apiserver-operator-796bbdcf4f-v76sx\" (UID: \"bd8f05be-d624-4ca1-bd92-658fa0a768d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.226222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqnk\" (UniqueName: \"kubernetes.io/projected/6fa75386-467c-467e-8ef7-27c65cd6a2b5-kube-api-access-bjqnk\") pod \"controller-manager-879f6c89f-cqpgr\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.239436 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt2pn\" (UniqueName: \"kubernetes.io/projected/23e4e919-2929-484f-afe8-e80ddc566e7c-kube-api-access-vt2pn\") pod \"route-controller-manager-6576b87f9c-824mb\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.245437 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.252810 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.268331 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.285733 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.285857 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.297001 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.305406 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 15 13:56:26 crc kubenswrapper[4794]: W1215 13:56:26.315440 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4f0599_ac2b_4b4e_b1d9_a7a813babfe4.slice/crio-a64a156d3bc92e9acac86c84c34d27672893533e0e2ba069a422cfb77680d89a WatchSource:0}: Error finding container a64a156d3bc92e9acac86c84c34d27672893533e0e2ba069a422cfb77680d89a: Status 404 returned error can't find the container with id a64a156d3bc92e9acac86c84c34d27672893533e0e2ba069a422cfb77680d89a Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.317012 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.322315 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.323489 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8fxqs"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.325114 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.331990 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:26 crc kubenswrapper[4794]: W1215 13:56:26.341005 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod481f059a_4b43_4cbe_989b_bec3c0b9b9fb.slice/crio-5e94e0781b702548da67e2a7614cdd72f206e32abd5f21046bec282f24bd4799 WatchSource:0}: Error finding container 5e94e0781b702548da67e2a7614cdd72f206e32abd5f21046bec282f24bd4799: Status 404 returned error can't find the container with id 5e94e0781b702548da67e2a7614cdd72f206e32abd5f21046bec282f24bd4799 Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.345675 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.365938 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.371440 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.380955 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-824w4" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.385743 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.420379 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.425894 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.435232 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqpgr"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.445176 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.465011 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.486240 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.501715 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qmz6h"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.511896 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.526435 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 
13:56:26.533002 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.537268 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" event={"ID":"6fa75386-467c-467e-8ef7-27c65cd6a2b5","Type":"ContainerStarted","Data":"fcc4c989808b6d646710b4b37cf36b67e7746f75d8fa18b86532d34b76e193a2"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.546806 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.547206 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" event={"ID":"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37","Type":"ContainerStarted","Data":"122b1d165cac5c32f39b9145bab19ba91ead77c06aed0abf580d5cd22c60f9c3"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.547244 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" event={"ID":"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37","Type":"ContainerStarted","Data":"52c4469c94624b36ae651878dfac7bd6775c0cf985c76354423d67bc6a5abd28"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.550180 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" event={"ID":"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4","Type":"ContainerStarted","Data":"509b67487d471c5702ad2854e28a08309d75996f544c3a87ae57bc5fc6712eba"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.550208 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" 
event={"ID":"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4","Type":"ContainerStarted","Data":"a64a156d3bc92e9acac86c84c34d27672893533e0e2ba069a422cfb77680d89a"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.555321 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" event={"ID":"481f059a-4b43-4cbe-989b-bec3c0b9b9fb","Type":"ContainerStarted","Data":"e930d471ada8f6cab41895e9c92e4ee4526412a0ed0e9500b9082386bbcb262e"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.555355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" event={"ID":"481f059a-4b43-4cbe-989b-bec3c0b9b9fb","Type":"ContainerStarted","Data":"5e94e0781b702548da67e2a7614cdd72f206e32abd5f21046bec282f24bd4799"} Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.565734 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.586664 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.606304 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.625845 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.634052 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-824w4"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.644338 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2rt5j"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.645234 4794 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.665821 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.686033 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.688834 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-69q8p"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.706468 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.725852 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.745353 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.765985 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.776888 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.785364 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.789847 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj"] Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.791805 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb"] Dec 15 13:56:26 crc kubenswrapper[4794]: E1215 13:56:26.798695 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:26 crc kubenswrapper[4794]: E1215 13:56:26.798772 4794 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:26 crc kubenswrapper[4794]: E1215 13:56:26.798792 4794 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:26 crc kubenswrapper[4794]: E1215 13:56:26.798776 4794 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Dec 15 13:56:26 crc kubenswrapper[4794]: E1215 13:56:26.798893 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:58:28.79886715 +0000 UTC m=+270.650889588 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:26 crc kubenswrapper[4794]: E1215 13:56:26.798919 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-15 13:58:28.798907872 +0000 UTC m=+270.650930420 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Dec 15 13:56:26 crc kubenswrapper[4794]: W1215 13:56:26.801593 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057601fe_03b1_48d4_8bbc_4482f393d6cb.slice/crio-fcf6469a1af535adbca9a42b59e08f482a07898d4ca8c82c477cfcf029001c45 WatchSource:0}: Error finding container fcf6469a1af535adbca9a42b59e08f482a07898d4ca8c82c477cfcf029001c45: Status 404 returned error can't find the container with id fcf6469a1af535adbca9a42b59e08f482a07898d4ca8c82c477cfcf029001c45 Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.805716 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: W1215 13:56:26.809809 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e4e919_2929_484f_afe8_e80ddc566e7c.slice/crio-bd7235065177f7298a6ef0af23b253891bdbc3dc161b5e6bc955576642eed1a8 WatchSource:0}: Error finding container bd7235065177f7298a6ef0af23b253891bdbc3dc161b5e6bc955576642eed1a8: Status 404 returned error can't find the container with id bd7235065177f7298a6ef0af23b253891bdbc3dc161b5e6bc955576642eed1a8 Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.825356 4794 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.845637 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.864817 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.884326 4794 request.go:700] Waited for 1.916067079s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.885965 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.909114 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.925363 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.964850 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z7dvg\" (UniqueName: \"kubernetes.io/projected/fdf86acc-2500-434f-963d-2aaa9219dfa1-kube-api-access-z7dvg\") pod \"openshift-controller-manager-operator-756b6f6bc6-j6274\" (UID: \"fdf86acc-2500-434f-963d-2aaa9219dfa1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.968111 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 15 13:56:26 crc kubenswrapper[4794]: I1215 13:56:26.985554 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.005817 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.045231 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.063322 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.067672 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.084992 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.105996 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.109328 4794 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.109452 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-15 13:58:29.109427284 +0000 UTC m=+270.961449722 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.109444 4794 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.109625 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-15 13:58:29.109554198 +0000 UTC m=+270.961576716 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217481 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-config\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217515 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08faaabe-18ef-488b-b8d9-d4709136bcd6-trusted-ca\") 
pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217614 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c235fb96-c626-4949-bd6e-a21dd37bc9d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217639 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-service-ca-bundle\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217701 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29c2c126-2c52-4332-b622-a73f6b7f09c1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217726 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d01d215-bf1a-4d62-8584-d15315cb4675-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 
13:56:27.217751 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d46051-64c9-4b63-b3de-c50f8943a37c-serving-cert\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217807 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d01d215-bf1a-4d62-8584-d15315cb4675-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-trusted-ca-bundle\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217875 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c370085-8ba2-4be8-9290-ff2f37189fae-config\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217910 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-bound-sa-token\") pod 
\"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217945 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c2c126-2c52-4332-b622-a73f6b7f09c1-config\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217969 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzn5x\" (UniqueName: \"kubernetes.io/projected/2c370085-8ba2-4be8-9290-ff2f37189fae-kube-api-access-nzn5x\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.217990 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbzs\" (UniqueName: \"kubernetes.io/projected/2d01d215-bf1a-4d62-8584-d15315cb4675-kube-api-access-zfbzs\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218036 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-default-certificate\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 
13:56:27.218099 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr9z8\" (UniqueName: \"kubernetes.io/projected/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-kube-api-access-vr9z8\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218140 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c235fb96-c626-4949-bd6e-a21dd37bc9d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218158 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-oauth-serving-cert\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218222 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218244 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d01d215-bf1a-4d62-8584-d15315cb4675-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-client\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218456 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08faaabe-18ef-488b-b8d9-d4709136bcd6-metrics-tls\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.218486 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f186724-4abc-41b9-9e11-c40588de02ea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 
crc kubenswrapper[4794]: I1215 13:56:27.218539 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-stats-auth\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.228907 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhcln\" (UID: \"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.229080 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nm5b\" (UniqueName: \"kubernetes.io/projected/2f186724-4abc-41b9-9e11-c40588de02ea-kube-api-access-2nm5b\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.229263 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-config\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.229916 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2c370085-8ba2-4be8-9290-ff2f37189fae-trusted-ca\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230069 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp5kj\" (UniqueName: \"kubernetes.io/projected/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-kube-api-access-qp5kj\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfx6\" (UniqueName: \"kubernetes.io/projected/68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4-kube-api-access-bqfx6\") pod \"cluster-samples-operator-665b6dd947-nhcln\" (UID: \"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230416 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ns2\" (UniqueName: \"kubernetes.io/projected/9777bea8-2829-45ff-85c1-69f25ff7f5ce-kube-api-access-82ns2\") pod \"migrator-59844c95c7-8pfs5\" (UID: \"9777bea8-2829-45ff-85c1-69f25ff7f5ce\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230447 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f186724-4abc-41b9-9e11-c40588de02ea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230819 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230905 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08faaabe-18ef-488b-b8d9-d4709136bcd6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.230969 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.231012 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-serving-cert\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.231478 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:27.731463333 +0000 UTC m=+149.583485771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.232201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-certificates\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.232230 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.232793 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxzr\" (UniqueName: \"kubernetes.io/projected/c8d46051-64c9-4b63-b3de-c50f8943a37c-kube-api-access-zkxzr\") pod \"etcd-operator-b45778765-shzgd\" (UID: 
\"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.233194 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29c2c126-2c52-4332-b622-a73f6b7f09c1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.233271 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgk6\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-kube-api-access-nxgk6\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.233415 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-config\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.233920 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-service-ca\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.233975 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-ca\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234024 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-service-ca\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234178 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234213 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c370085-8ba2-4be8-9290-ff2f37189fae-serving-cert\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234358 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-tls\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: 
\"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234433 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-trusted-ca\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234466 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptwq\" (UniqueName: \"kubernetes.io/projected/6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d-kube-api-access-8ptwq\") pod \"downloads-7954f5f757-5m27d\" (UID: \"6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d\") " pod="openshift-console/downloads-7954f5f757-5m27d"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.234493 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15605bda-e3da-46db-8c0d-9ebfadab8bbb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lvgzh\" (UID: \"15605bda-e3da-46db-8c0d-9ebfadab8bbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.235064 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-oauth-config\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.235131 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-metrics-certs\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.235613 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhzwb\" (UniqueName: \"kubernetes.io/projected/08faaabe-18ef-488b-b8d9-d4709136bcd6-kube-api-access-dhzwb\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.235844 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8z86\" (UniqueName: \"kubernetes.io/projected/15605bda-e3da-46db-8c0d-9ebfadab8bbb-kube-api-access-m8z86\") pod \"control-plane-machine-set-operator-78cbb6b69f-lvgzh\" (UID: \"15605bda-e3da-46db-8c0d-9ebfadab8bbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.303616 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274"]
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.336971 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.337203 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:27.837171423 +0000 UTC m=+149.689193861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337295 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqpw\" (UniqueName: \"kubernetes.io/projected/da9394b5-19ce-4482-b107-9339d9813a25-kube-api-access-mfqpw\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337330 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-serving-cert\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337359 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9669w\" (UniqueName: \"kubernetes.io/projected/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-kube-api-access-9669w\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337384 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftcz\" (UniqueName: \"kubernetes.io/projected/308314c2-75ff-4d16-9c99-15574bb6a3d8-kube-api-access-kftcz\") pod \"multus-admission-controller-857f4d67dd-5qmc9\" (UID: \"308314c2-75ff-4d16-9c99-15574bb6a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337418 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-certificates\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337443 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337473 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxzr\" (UniqueName: \"kubernetes.io/projected/c8d46051-64c9-4b63-b3de-c50f8943a37c-kube-api-access-zkxzr\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337498 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/854fdfac-41a5-473c-b955-95b2ecae9856-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dsm7q\" (UID: \"854fdfac-41a5-473c-b955-95b2ecae9856\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337521 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-images\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337548 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-plugins-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337571 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-signing-cabundle\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337611 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29c2c126-2c52-4332-b622-a73f6b7f09c1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4wv\" (UniqueName: \"kubernetes.io/projected/03a11d34-91da-442e-a26c-a7a5453374ab-kube-api-access-xc4wv\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-config\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337710 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwnv\" (UniqueName: \"kubernetes.io/projected/483b4762-71e8-4fb2-9d6a-2daa57a596df-kube-api-access-bjwnv\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337734 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgk6\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-kube-api-access-nxgk6\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337760 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb1b4e69-9921-4aef-b791-5408f51c4f71-profile-collector-cert\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337798 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-service-ca\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337821 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-service-ca\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337845 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-ca\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337867 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-csi-data-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337912 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c370085-8ba2-4be8-9290-ff2f37189fae-serving-cert\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337933 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-tls\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-trusted-ca\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.337977 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptwq\" (UniqueName: \"kubernetes.io/projected/6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d-kube-api-access-8ptwq\") pod \"downloads-7954f5f757-5m27d\" (UID: \"6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d\") " pod="openshift-console/downloads-7954f5f757-5m27d"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338000 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15605bda-e3da-46db-8c0d-9ebfadab8bbb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lvgzh\" (UID: \"15605bda-e3da-46db-8c0d-9ebfadab8bbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338025 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-proxy-tls\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-oauth-config\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338067 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-metrics-certs\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338108 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhzwb\" (UniqueName: \"kubernetes.io/projected/08faaabe-18ef-488b-b8d9-d4709136bcd6-kube-api-access-dhzwb\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338132 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxcrb\" (UniqueName: \"kubernetes.io/projected/39567a2d-ea06-4b63-ac4f-9615807a599c-kube-api-access-qxcrb\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338154 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62k9\" (UniqueName: \"kubernetes.io/projected/fb1b4e69-9921-4aef-b791-5408f51c4f71-kube-api-access-l62k9\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338235 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8z86\" (UniqueName: \"kubernetes.io/projected/15605bda-e3da-46db-8c0d-9ebfadab8bbb-kube-api-access-m8z86\") pod \"control-plane-machine-set-operator-78cbb6b69f-lvgzh\" (UID: \"15605bda-e3da-46db-8c0d-9ebfadab8bbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338258 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-webhook-cert\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338281 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9hv\" (UniqueName: \"kubernetes.io/projected/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-kube-api-access-vp9hv\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338319 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-config\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338340 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08faaabe-18ef-488b-b8d9-d4709136bcd6-trusted-ca\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338372 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gqqg\" (UniqueName: \"kubernetes.io/projected/cf38d336-cbd8-4f99-9aa0-4a826979456c-kube-api-access-2gqqg\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338402 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-service-ca-bundle\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338425 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j859v\" (UniqueName: \"kubernetes.io/projected/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-kube-api-access-j859v\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338448 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c235fb96-c626-4949-bd6e-a21dd37bc9d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338474 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338501 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29c2c126-2c52-4332-b622-a73f6b7f09c1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338524 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-mountpoint-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338551 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d01d215-bf1a-4d62-8584-d15315cb4675-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338602 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d46051-64c9-4b63-b3de-c50f8943a37c-serving-cert\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338639 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d01d215-bf1a-4d62-8584-d15315cb4675-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338675 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-trusted-ca-bundle\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338698 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c370085-8ba2-4be8-9290-ff2f37189fae-config\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338721 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-bound-sa-token\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338744 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c2c126-2c52-4332-b622-a73f6b7f09c1-config\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338769 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/483b4762-71e8-4fb2-9d6a-2daa57a596df-serving-cert\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338789 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/308314c2-75ff-4d16-9c99-15574bb6a3d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qmc9\" (UID: \"308314c2-75ff-4d16-9c99-15574bb6a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338812 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483b4762-71e8-4fb2-9d6a-2daa57a596df-config\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338836 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzn5x\" (UniqueName: \"kubernetes.io/projected/2c370085-8ba2-4be8-9290-ff2f37189fae-kube-api-access-nzn5x\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338857 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da9394b5-19ce-4482-b107-9339d9813a25-config-volume\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338885 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-default-certificate\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338908 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbzs\" (UniqueName: \"kubernetes.io/projected/2d01d215-bf1a-4d62-8584-d15315cb4675-kube-api-access-zfbzs\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338934 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2gs\" (UniqueName: \"kubernetes.io/projected/8900322d-6a30-4b78-807c-ef710a33d219-kube-api-access-hr2gs\") pod \"ingress-canary-pdzrx\" (UID: \"8900322d-6a30-4b78-807c-ef710a33d219\") " pod="openshift-ingress-canary/ingress-canary-pdzrx"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338955 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-signing-key\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.338995 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9z8\" (UniqueName: \"kubernetes.io/projected/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-kube-api-access-vr9z8\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339032 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da9394b5-19ce-4482-b107-9339d9813a25-secret-volume\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c235fb96-c626-4949-bd6e-a21dd37bc9d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339077 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-oauth-serving-cert\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339099 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339122 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339148 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb1b4e69-9921-4aef-b791-5408f51c4f71-srv-cert\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339171 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d01d215-bf1a-4d62-8584-d15315cb4675-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-client\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339214 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03a11d34-91da-442e-a26c-a7a5453374ab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-tmpfs\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-registration-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339304 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08faaabe-18ef-488b-b8d9-d4709136bcd6-metrics-tls\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339342 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f186724-4abc-41b9-9e11-c40588de02ea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339365 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmmn\" (UniqueName: \"kubernetes.io/projected/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-kube-api-access-shmmn\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339393 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-node-bootstrap-token\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339414 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf38d336-cbd8-4f99-9aa0-4a826979456c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339436 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-apiservice-cert\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339490 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-stats-auth\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339514 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhcln\" (UID: \"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339560 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nm5b\" (UniqueName: \"kubernetes.io/projected/2f186724-4abc-41b9-9e11-c40588de02ea-kube-api-access-2nm5b\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339627 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-config\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339649 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-certs\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc 
kubenswrapper[4794]: I1215 13:56:27.339672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c370085-8ba2-4be8-9290-ff2f37189fae-trusted-ca\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339695 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03a11d34-91da-442e-a26c-a7a5453374ab-srv-cert\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339716 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39567a2d-ea06-4b63-ac4f-9615807a599c-metrics-tls\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339740 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8900322d-6a30-4b78-807c-ef710a33d219-cert\") pod \"ingress-canary-pdzrx\" (UID: \"8900322d-6a30-4b78-807c-ef710a33d219\") " pod="openshift-ingress-canary/ingress-canary-pdzrx" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339763 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9g2\" (UniqueName: \"kubernetes.io/projected/854fdfac-41a5-473c-b955-95b2ecae9856-kube-api-access-jr9g2\") pod \"package-server-manager-789f6589d5-dsm7q\" (UID: \"854fdfac-41a5-473c-b955-95b2ecae9856\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339793 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47ht\" (UniqueName: \"kubernetes.io/projected/3f106ed7-7479-46b8-ab6f-e6b291078caa-kube-api-access-j47ht\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339821 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp5kj\" (UniqueName: \"kubernetes.io/projected/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-kube-api-access-qp5kj\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339845 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfx6\" (UniqueName: \"kubernetes.io/projected/68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4-kube-api-access-bqfx6\") pod \"cluster-samples-operator-665b6dd947-nhcln\" (UID: \"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339872 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82ns2\" (UniqueName: \"kubernetes.io/projected/9777bea8-2829-45ff-85c1-69f25ff7f5ce-kube-api-access-82ns2\") pod \"migrator-59844c95c7-8pfs5\" (UID: \"9777bea8-2829-45ff-85c1-69f25ff7f5ce\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339896 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f186724-4abc-41b9-9e11-c40588de02ea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339921 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf38d336-cbd8-4f99-9aa0-4a826979456c-proxy-tls\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339943 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.339987 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39567a2d-ea06-4b63-ac4f-9615807a599c-config-volume\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.340009 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-socket-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" Dec 
15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.340043 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.340066 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08faaabe-18ef-488b-b8d9-d4709136bcd6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.340093 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmt62\" (UniqueName: \"kubernetes.io/projected/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-kube-api-access-vmt62\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.345138 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-trusted-ca-bundle\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.346010 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.346842 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c370085-8ba2-4be8-9290-ff2f37189fae-config\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.347887 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29c2c126-2c52-4332-b622-a73f6b7f09c1-config\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.348174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-serving-cert\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.349525 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-certificates\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.350343 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-config\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.351701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c370085-8ba2-4be8-9290-ff2f37189fae-trusted-ca\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.351882 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f186724-4abc-41b9-9e11-c40588de02ea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.351943 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29c2c126-2c52-4332-b622-a73f6b7f09c1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.353417 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-config\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.353538 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-default-certificate\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.353964 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.354499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-config\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.354502 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-service-ca-bundle\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.354827 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c235fb96-c626-4949-bd6e-a21dd37bc9d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc 
kubenswrapper[4794]: I1215 13:56:27.355041 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-trusted-ca\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.355143 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:27.855128455 +0000 UTC m=+149.707150893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.355518 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-client\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.355630 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d46051-64c9-4b63-b3de-c50f8943a37c-serving-cert\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc 
kubenswrapper[4794]: I1215 13:56:27.355982 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-ca\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.356271 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-oauth-serving-cert\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.356359 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-service-ca\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.356923 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c8d46051-64c9-4b63-b3de-c50f8943a37c-etcd-service-ca\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.358206 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15605bda-e3da-46db-8c0d-9ebfadab8bbb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lvgzh\" (UID: \"15605bda-e3da-46db-8c0d-9ebfadab8bbb\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.357714 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c235fb96-c626-4949-bd6e-a21dd37bc9d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.358516 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f186724-4abc-41b9-9e11-c40588de02ea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.358536 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-tls\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.358770 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c370085-8ba2-4be8-9290-ff2f37189fae-serving-cert\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.358924 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/08faaabe-18ef-488b-b8d9-d4709136bcd6-trusted-ca\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.359028 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-oauth-config\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.359819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d01d215-bf1a-4d62-8584-d15315cb4675-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.360133 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.362364 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29c2c126-2c52-4332-b622-a73f6b7f09c1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dwkxc\" (UID: \"29c2c126-2c52-4332-b622-a73f6b7f09c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: 
I1215 13:56:27.362518 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhcln\" (UID: \"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.370427 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-metrics-certs\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.371031 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-stats-auth\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.379251 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08faaabe-18ef-488b-b8d9-d4709136bcd6-metrics-tls\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.379667 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d01d215-bf1a-4d62-8584-d15315cb4675-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" 
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.382963 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d01d215-bf1a-4d62-8584-d15315cb4675-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.396385 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.427236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9z8\" (UniqueName: \"kubernetes.io/projected/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-kube-api-access-vr9z8\") pod \"console-f9d7485db-rfx4t\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.440393 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-bound-sa-token\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.440727 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.440932 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:27.94090098 +0000 UTC m=+149.792923418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441052 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-webhook-cert\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441164 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9hv\" (UniqueName: \"kubernetes.io/projected/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-kube-api-access-vp9hv\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441273 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gqqg\" (UniqueName: \"kubernetes.io/projected/cf38d336-cbd8-4f99-9aa0-4a826979456c-kube-api-access-2gqqg\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441374 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j859v\" (UniqueName: \"kubernetes.io/projected/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-kube-api-access-j859v\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441463 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441556 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-mountpoint-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441696 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/483b4762-71e8-4fb2-9d6a-2daa57a596df-serving-cert\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441794 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/308314c2-75ff-4d16-9c99-15574bb6a3d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qmc9\" (UID: \"308314c2-75ff-4d16-9c99-15574bb6a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.441905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483b4762-71e8-4fb2-9d6a-2daa57a596df-config\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442014 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da9394b5-19ce-4482-b107-9339d9813a25-config-volume\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442117 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2gs\" (UniqueName: \"kubernetes.io/projected/8900322d-6a30-4b78-807c-ef710a33d219-kube-api-access-hr2gs\") pod \"ingress-canary-pdzrx\" (UID: \"8900322d-6a30-4b78-807c-ef710a33d219\") " pod="openshift-ingress-canary/ingress-canary-pdzrx"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442213 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-signing-key\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442313 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da9394b5-19ce-4482-b107-9339d9813a25-secret-volume\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442418 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb1b4e69-9921-4aef-b791-5408f51c4f71-srv-cert\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03a11d34-91da-442e-a26c-a7a5453374ab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442634 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-tmpfs\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442730 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-registration-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442831 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmmn\" (UniqueName: \"kubernetes.io/projected/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-kube-api-access-shmmn\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.442923 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-node-bootstrap-token\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443010 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf38d336-cbd8-4f99-9aa0-4a826979456c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443102 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-apiservice-cert\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443210 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-certs\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443300 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03a11d34-91da-442e-a26c-a7a5453374ab-srv-cert\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443387 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8900322d-6a30-4b78-807c-ef710a33d219-cert\") pod \"ingress-canary-pdzrx\" (UID: \"8900322d-6a30-4b78-807c-ef710a33d219\") " pod="openshift-ingress-canary/ingress-canary-pdzrx"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443471 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9g2\" (UniqueName: \"kubernetes.io/projected/854fdfac-41a5-473c-b955-95b2ecae9856-kube-api-access-jr9g2\") pod \"package-server-manager-789f6589d5-dsm7q\" (UID: \"854fdfac-41a5-473c-b955-95b2ecae9856\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443555 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39567a2d-ea06-4b63-ac4f-9615807a599c-metrics-tls\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443688 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47ht\" (UniqueName: \"kubernetes.io/projected/3f106ed7-7479-46b8-ab6f-e6b291078caa-kube-api-access-j47ht\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443811 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf38d336-cbd8-4f99-9aa0-4a826979456c-proxy-tls\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443902 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443992 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39567a2d-ea06-4b63-ac4f-9615807a599c-config-volume\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444081 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-socket-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444178 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444296 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmt62\" (UniqueName: \"kubernetes.io/projected/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-kube-api-access-vmt62\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444381 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-tmpfs\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444388 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqpw\" (UniqueName: \"kubernetes.io/projected/da9394b5-19ce-4482-b107-9339d9813a25-kube-api-access-mfqpw\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444483 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9669w\" (UniqueName: \"kubernetes.io/projected/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-kube-api-access-9669w\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444506 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftcz\" (UniqueName: \"kubernetes.io/projected/308314c2-75ff-4d16-9c99-15574bb6a3d8-kube-api-access-kftcz\") pod \"multus-admission-controller-857f4d67dd-5qmc9\" (UID: \"308314c2-75ff-4d16-9c99-15574bb6a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444560 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/854fdfac-41a5-473c-b955-95b2ecae9856-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dsm7q\" (UID: \"854fdfac-41a5-473c-b955-95b2ecae9856\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444604 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-images\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-plugins-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444677 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-signing-cabundle\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444701 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4wv\" (UniqueName: \"kubernetes.io/projected/03a11d34-91da-442e-a26c-a7a5453374ab-kube-api-access-xc4wv\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444745 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444786 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwnv\" (UniqueName: \"kubernetes.io/projected/483b4762-71e8-4fb2-9d6a-2daa57a596df-kube-api-access-bjwnv\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444808 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb1b4e69-9921-4aef-b791-5408f51c4f71-profile-collector-cert\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444832 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-csi-data-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444861 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-proxy-tls\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444907 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxcrb\" (UniqueName: \"kubernetes.io/projected/39567a2d-ea06-4b63-ac4f-9615807a599c-kube-api-access-qxcrb\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.444929 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62k9\" (UniqueName: \"kubernetes.io/projected/fb1b4e69-9921-4aef-b791-5408f51c4f71-kube-api-access-l62k9\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.445690 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483b4762-71e8-4fb2-9d6a-2daa57a596df-config\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.446060 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-registration-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.443689 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-mountpoint-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.446458 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/308314c2-75ff-4d16-9c99-15574bb6a3d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qmc9\" (UID: \"308314c2-75ff-4d16-9c99-15574bb6a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.446694 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da9394b5-19ce-4482-b107-9339d9813a25-config-volume\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.446992 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.447239 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39567a2d-ea06-4b63-ac4f-9615807a599c-config-volume\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.447956 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-apiservice-cert\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.448171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf38d336-cbd8-4f99-9aa0-4a826979456c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.448201 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-socket-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.448430 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:27.948414422 +0000 UTC m=+149.800436960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.448605 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-signing-cabundle\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.448999 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-plugins-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.449163 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-images\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.449618 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-webhook-cert\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.449794 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3f106ed7-7479-46b8-ab6f-e6b291078caa-csi-data-dir\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.451124 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39567a2d-ea06-4b63-ac4f-9615807a599c-metrics-tls\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.451163 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/483b4762-71e8-4fb2-9d6a-2daa57a596df-serving-cert\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.451217 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.453111 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8900322d-6a30-4b78-807c-ef710a33d219-cert\") pod \"ingress-canary-pdzrx\" (UID: \"8900322d-6a30-4b78-807c-ef710a33d219\") " pod="openshift-ingress-canary/ingress-canary-pdzrx"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.455229 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/854fdfac-41a5-473c-b955-95b2ecae9856-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dsm7q\" (UID: \"854fdfac-41a5-473c-b955-95b2ecae9856\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.455281 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf38d336-cbd8-4f99-9aa0-4a826979456c-proxy-tls\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.455282 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb1b4e69-9921-4aef-b791-5408f51c4f71-profile-collector-cert\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.455699 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03a11d34-91da-442e-a26c-a7a5453374ab-srv-cert\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.457159 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-proxy-tls\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.457649 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03a11d34-91da-442e-a26c-a7a5453374ab-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.458772 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-signing-key\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.458889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb1b4e69-9921-4aef-b791-5408f51c4f71-srv-cert\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.459140 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.460883 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-node-bootstrap-token\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.461171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da9394b5-19ce-4482-b107-9339d9813a25-secret-volume\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.462846 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-certs\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.474039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzn5x\" (UniqueName: \"kubernetes.io/projected/2c370085-8ba2-4be8-9290-ff2f37189fae-kube-api-access-nzn5x\") pod \"console-operator-58897d9998-pdnn8\" (UID: \"2c370085-8ba2-4be8-9290-ff2f37189fae\") " pod="openshift-console-operator/console-operator-58897d9998-pdnn8"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.507556 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbzs\" (UniqueName: \"kubernetes.io/projected/2d01d215-bf1a-4d62-8584-d15315cb4675-kube-api-access-zfbzs\") pod \"cluster-image-registry-operator-dc59b4c8b-glkfs\" (UID: \"2d01d215-bf1a-4d62-8584-d15315cb4675\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.514951 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkxzr\" (UniqueName: \"kubernetes.io/projected/c8d46051-64c9-4b63-b3de-c50f8943a37c-kube-api-access-zkxzr\") pod \"etcd-operator-b45778765-shzgd\" (UID: \"c8d46051-64c9-4b63-b3de-c50f8943a37c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.534802 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp5kj\" (UniqueName: \"kubernetes.io/projected/06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7-kube-api-access-qp5kj\") pod \"router-default-5444994796-wd9tc\" (UID: \"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7\") " pod="openshift-ingress/router-default-5444994796-wd9tc"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.542272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfx6\" (UniqueName: \"kubernetes.io/projected/68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4-kube-api-access-bqfx6\") pod \"cluster-samples-operator-665b6dd947-nhcln\" (UID: \"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.545634 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.546518 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.046502767 +0000 UTC m=+149.898525205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.580946 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" event={"ID":"500b8fbe-cdab-46ef-8c4a-bda78ba3ed37","Type":"ContainerStarted","Data":"0dddd896c9cd541bce20defa45cc1ba0f0b91b5bbdcf9e53b931567a7b12896b"}
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.585993 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82ns2\" (UniqueName: \"kubernetes.io/projected/9777bea8-2829-45ff-85c1-69f25ff7f5ce-kube-api-access-82ns2\") pod \"migrator-59844c95c7-8pfs5\" (UID: \"9777bea8-2829-45ff-85c1-69f25ff7f5ce\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.588917 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08faaabe-18ef-488b-b8d9-d4709136bcd6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: \"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.593894 4794 generic.go:334] "Generic (PLEG): container finished" podID="6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4" containerID="509b67487d471c5702ad2854e28a08309d75996f544c3a87ae57bc5fc6712eba" exitCode=0
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.593991 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" event={"ID":"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4","Type":"ContainerDied","Data":"509b67487d471c5702ad2854e28a08309d75996f544c3a87ae57bc5fc6712eba"}
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.600101 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc"]
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.603319 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" event={"ID":"bd8f05be-d624-4ca1-bd92-658fa0a768d6","Type":"ContainerStarted","Data":"e6c0aae8fd711247c755bd01075bf41088d2713553ab0e202cd742c5f58a7331"}
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.603367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" event={"ID":"bd8f05be-d624-4ca1-bd92-658fa0a768d6","Type":"ContainerStarted","Data":"59c9354e474bf3ac85922383a1671e22e8684fc6e97370013841e0d5bb33ef7d"}
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.605350 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8z86\" (UniqueName: \"kubernetes.io/projected/15605bda-e3da-46db-8c0d-9ebfadab8bbb-kube-api-access-m8z86\") pod \"control-plane-machine-set-operator-78cbb6b69f-lvgzh\" (UID: \"15605bda-e3da-46db-8c0d-9ebfadab8bbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.613937 4794 generic.go:334] "Generic (PLEG): container finished" podID="acf556c6-8a5a-4980-b07d-28939b2246ee" containerID="258b775d5f95af4510981c5b808192b806df16b84743520f6a9856236800279a" exitCode=0
Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.614017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-apiserver/apiserver-76f77b778f-69q8p" event={"ID":"acf556c6-8a5a-4980-b07d-28939b2246ee","Type":"ContainerDied","Data":"258b775d5f95af4510981c5b808192b806df16b84743520f6a9856236800279a"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.614051 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" event={"ID":"acf556c6-8a5a-4980-b07d-28939b2246ee","Type":"ContainerStarted","Data":"0357725b559343f0f43e90dc3658bbbf0f1f1f3279c75c0dcd1d861a0f54e4a4"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.624560 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" event={"ID":"28dac958-62ff-4d38-9bf6-a86fa57fb772","Type":"ContainerStarted","Data":"8d3ced1f5a7834f1463053efec1c8c470aefae73bd2d692bbf92a64e6f06d84e"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.624641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" event={"ID":"28dac958-62ff-4d38-9bf6-a86fa57fb772","Type":"ContainerStarted","Data":"fa06418e6deb67921ed36962cab7f275c5525df2bdbcfad8ac70c0961a29eca0"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.625267 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.639084 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgk6\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-kube-api-access-nxgk6\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.641229 4794 generic.go:334] "Generic (PLEG): container finished" podID="057601fe-03b1-48d4-8bbc-4482f393d6cb" 
containerID="c217eef674e40fd57320bb6fbb81c3fe6abccb02b362e4fd743ea3e114c033f5" exitCode=0 Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.641314 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" event={"ID":"057601fe-03b1-48d4-8bbc-4482f393d6cb","Type":"ContainerDied","Data":"c217eef674e40fd57320bb6fbb81c3fe6abccb02b362e4fd743ea3e114c033f5"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.641340 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" event={"ID":"057601fe-03b1-48d4-8bbc-4482f393d6cb","Type":"ContainerStarted","Data":"fcf6469a1af535adbca9a42b59e08f482a07898d4ca8c82c477cfcf029001c45"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.647622 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.648222 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.148191598 +0000 UTC m=+150.000214046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.648723 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.657557 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.660074 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c74c66-25d9-4f7c-a9d0-8d17b8387aa8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-86n6q\" (UID: \"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.685725 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" event={"ID":"23e4e919-2929-484f-afe8-e80ddc566e7c","Type":"ContainerStarted","Data":"ea63348063b5827a95cf5eb6936939fa93531af90af599752f2e7de48dea2ecf"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.685832 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" 
event={"ID":"23e4e919-2929-484f-afe8-e80ddc566e7c","Type":"ContainerStarted","Data":"bd7235065177f7298a6ef0af23b253891bdbc3dc161b5e6bc955576642eed1a8"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.687350 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.694374 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rfqgz\" (UID: \"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.705969 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.712546 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.718851 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.718963 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" event={"ID":"0d34d344-13c1-4816-8286-2104852b248b","Type":"ContainerStarted","Data":"5f5b48464c301823a309b2139bd03f5f58226e87eb531bb82d1b0d6cc64c6279"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.719004 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" event={"ID":"0d34d344-13c1-4816-8286-2104852b248b","Type":"ContainerStarted","Data":"bd1c6715b220d053bd9994024e361881e0ac6f83bda440a9b3e13613f75a7965"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.719017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" event={"ID":"0d34d344-13c1-4816-8286-2104852b248b","Type":"ContainerStarted","Data":"f7b87800cc6940ff1a67118945f11b4b65493e06ff668a15177d278ba2f1dca5"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.724381 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-824w4" event={"ID":"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331","Type":"ContainerStarted","Data":"e38f7bdaf8b0e3a3c798945682ba736e1e79320d84e4913d03d53b8e236b268b"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.724428 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-824w4" event={"ID":"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331","Type":"ContainerStarted","Data":"ecc3cc02484d4c935334412171e9c4306b67758d2a45a4501ed64b5257e2fcd0"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.724486 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-824w4" 
event={"ID":"e28d0d4e-67a3-4dc0-a0f5-ea7aac7cb331","Type":"ContainerStarted","Data":"2e037c7e8de6e849aa9e7f9229f98a1a310d0ec98362633994b0976531ed3998"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.725831 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.731989 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.736914 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.746169 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.748322 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.749749 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.249728916 +0000 UTC m=+150.101751354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.755416 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nm5b\" (UniqueName: \"kubernetes.io/projected/2f186724-4abc-41b9-9e11-c40588de02ea-kube-api-access-2nm5b\") pod \"kube-storage-version-migrator-operator-b67b599dd-cxtdf\" (UID: \"2f186724-4abc-41b9-9e11-c40588de02ea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.765532 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" event={"ID":"6fa75386-467c-467e-8ef7-27c65cd6a2b5","Type":"ContainerStarted","Data":"4c5e71643eba6e68b8d8fb74163c30bf12c851b026e3b6c043de0575ea3d3ea6"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.765614 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.788775 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.792379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhzwb\" (UniqueName: \"kubernetes.io/projected/08faaabe-18ef-488b-b8d9-d4709136bcd6-kube-api-access-dhzwb\") pod \"ingress-operator-5b745b69d9-rx78x\" (UID: 
\"08faaabe-18ef-488b-b8d9-d4709136bcd6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.798133 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptwq\" (UniqueName: \"kubernetes.io/projected/6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d-kube-api-access-8ptwq\") pod \"downloads-7954f5f757-5m27d\" (UID: \"6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d\") " pod="openshift-console/downloads-7954f5f757-5m27d" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.799212 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gqqg\" (UniqueName: \"kubernetes.io/projected/cf38d336-cbd8-4f99-9aa0-4a826979456c-kube-api-access-2gqqg\") pod \"machine-config-controller-84d6567774-mvg8j\" (UID: \"cf38d336-cbd8-4f99-9aa0-4a826979456c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.814038 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j859v\" (UniqueName: \"kubernetes.io/projected/abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1-kube-api-access-j859v\") pod \"service-ca-9c57cc56f-vvw2l\" (UID: \"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1\") " pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.818295 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" event={"ID":"fdf86acc-2500-434f-963d-2aaa9219dfa1","Type":"ContainerStarted","Data":"972925bf0a3c3d517c41ee355e1731ce70766f1fc08f503877b429d6993f4fa2"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.818350 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" 
event={"ID":"fdf86acc-2500-434f-963d-2aaa9219dfa1","Type":"ContainerStarted","Data":"199ec0fd555a9faf85f08cc45b62e9d8ef362a98c7d63ff92479e27d506535db"} Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.818965 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.819976 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.835293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9hv\" (UniqueName: \"kubernetes.io/projected/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-kube-api-access-vp9hv\") pod \"marketplace-operator-79b997595-tvkw5\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.844457 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2gs\" (UniqueName: \"kubernetes.io/projected/8900322d-6a30-4b78-807c-ef710a33d219-kube-api-access-hr2gs\") pod \"ingress-canary-pdzrx\" (UID: \"8900322d-6a30-4b78-807c-ef710a33d219\") " pod="openshift-ingress-canary/ingress-canary-pdzrx" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.848450 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62k9\" (UniqueName: \"kubernetes.io/projected/fb1b4e69-9921-4aef-b791-5408f51c4f71-kube-api-access-l62k9\") pod \"catalog-operator-68c6474976-682gk\" (UID: \"fb1b4e69-9921-4aef-b791-5408f51c4f71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.850256 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.851798 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.351783778 +0000 UTC m=+150.203806216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.870326 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqpw\" (UniqueName: \"kubernetes.io/projected/da9394b5-19ce-4482-b107-9339d9813a25-kube-api-access-mfqpw\") pod \"collect-profiles-29430105-2hmzt\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.884939 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pdzrx" Dec 15 13:56:27 crc kubenswrapper[4794]: W1215 13:56:27.901813 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06366fe3_a2f0_4ff6_b2ec_61d1d9a736e7.slice/crio-dd97fa6103546ab971b5619a05211929357efbea53ed46ab9e3ccbdb9f95e087 WatchSource:0}: Error finding container dd97fa6103546ab971b5619a05211929357efbea53ed46ab9e3ccbdb9f95e087: Status 404 returned error can't find the container with id dd97fa6103546ab971b5619a05211929357efbea53ed46ab9e3ccbdb9f95e087 Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.902357 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmmn\" (UniqueName: \"kubernetes.io/projected/77efa6c3-7ad2-40d0-83f6-17db5e5c5c90-kube-api-access-shmmn\") pod \"machine-config-server-jpcm8\" (UID: \"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90\") " pod="openshift-machine-config-operator/machine-config-server-jpcm8" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.924631 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4wv\" (UniqueName: \"kubernetes.io/projected/03a11d34-91da-442e-a26c-a7a5453374ab-kube-api-access-xc4wv\") pod \"olm-operator-6b444d44fb-bbg49\" (UID: \"03a11d34-91da-442e-a26c-a7a5453374ab\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.942393 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9669w\" (UniqueName: \"kubernetes.io/projected/f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e-kube-api-access-9669w\") pod \"packageserver-d55dfcdfc-nbcfc\" (UID: \"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.947227 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kftcz\" (UniqueName: \"kubernetes.io/projected/308314c2-75ff-4d16-9c99-15574bb6a3d8-kube-api-access-kftcz\") pod \"multus-admission-controller-857f4d67dd-5qmc9\" (UID: \"308314c2-75ff-4d16-9c99-15574bb6a3d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.954183 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5m27d" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.962889 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.963251 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.463224982 +0000 UTC m=+150.315247420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.963447 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:27 crc kubenswrapper[4794]: E1215 13:56:27.964098 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.464078885 +0000 UTC m=+150.316101403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.968857 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxcrb\" (UniqueName: \"kubernetes.io/projected/39567a2d-ea06-4b63-ac4f-9615807a599c-kube-api-access-qxcrb\") pod \"dns-default-kpl54\" (UID: \"39567a2d-ea06-4b63-ac4f-9615807a599c\") " pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:27 crc kubenswrapper[4794]: I1215 13:56:27.973283 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:27.999986 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.009112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9g2\" (UniqueName: \"kubernetes.io/projected/854fdfac-41a5-473c-b955-95b2ecae9856-kube-api-access-jr9g2\") pod \"package-server-manager-789f6589d5-dsm7q\" (UID: \"854fdfac-41a5-473c-b955-95b2ecae9856\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.020557 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pdnn8"] Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.027541 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmt62\" (UniqueName: \"kubernetes.io/projected/6e098c00-1b7c-4f84-bb6e-c2769a4f7a55-kube-api-access-vmt62\") pod \"machine-config-operator-74547568cd-4vc5z\" (UID: \"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.031888 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwnv\" (UniqueName: \"kubernetes.io/projected/483b4762-71e8-4fb2-9d6a-2daa57a596df-kube-api-access-bjwnv\") pod \"service-ca-operator-777779d784-59wlr\" (UID: \"483b4762-71e8-4fb2-9d6a-2daa57a596df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.040315 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.055215 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.060821 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.064351 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.064724 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.064765 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.56474958 +0000 UTC m=+150.416772018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.073964 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47ht\" (UniqueName: \"kubernetes.io/projected/3f106ed7-7479-46b8-ab6f-e6b291078caa-kube-api-access-j47ht\") pod \"csi-hostpathplugin-n7hgc\" (UID: \"3f106ed7-7479-46b8-ab6f-e6b291078caa\") " pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.086277 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.086739 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.095875 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.107876 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.115841 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.123623 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.133188 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.141122 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.146864 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.170366 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.170811 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.670797789 +0000 UTC m=+150.522820227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.171828 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.187489 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jpcm8" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.193015 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rfx4t"] Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.271844 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.275228 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.775208884 +0000 UTC m=+150.627231322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.374708 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.375181 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.875164329 +0000 UTC m=+150.727186767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: W1215 13:56:28.395059 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa7b6fc7_0eb1_44b4_a69b_d8826bc00e33.slice/crio-f7308c011bdd4f72c35f1891c28eb53cd6df20e3968172b5a6f965cb399d856b WatchSource:0}: Error finding container f7308c011bdd4f72c35f1891c28eb53cd6df20e3968172b5a6f965cb399d856b: Status 404 returned error can't find the container with id f7308c011bdd4f72c35f1891c28eb53cd6df20e3968172b5a6f965cb399d856b Dec 15 13:56:28 crc kubenswrapper[4794]: W1215 13:56:28.421377 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77efa6c3_7ad2_40d0_83f6_17db5e5c5c90.slice/crio-6d5b034a2db25f4b53a51bf77c22fd3f94c796206d0857c5e692bfdf6876d84b WatchSource:0}: Error finding container 6d5b034a2db25f4b53a51bf77c22fd3f94c796206d0857c5e692bfdf6876d84b: Status 404 returned error can't find the container with id 6d5b034a2db25f4b53a51bf77c22fd3f94c796206d0857c5e692bfdf6876d84b Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.479765 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.480152 4794 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:28.9801376 +0000 UTC m=+150.832160038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.581524 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.581961 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.081944155 +0000 UTC m=+150.933966603 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.628701 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" podStartSLOduration=131.62867755 podStartE2EDuration="2m11.62867755s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:28.616551785 +0000 UTC m=+150.468574233" watchObservedRunningTime="2025-12-15 13:56:28.62867755 +0000 UTC m=+150.480699998" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.691130 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.691620 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.191601881 +0000 UTC m=+151.043624319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.793680 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.794079 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.294066974 +0000 UTC m=+151.146089412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.881433 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs"] Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.881768 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" event={"ID":"6d4f0599-ac2b-4b4e-b1d9-a7a813babfe4","Type":"ContainerStarted","Data":"cbf738f9eb8d73421bdae11da43de20c184ac7eb89c6fafd18330d23fc55c923"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.881789 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x"] Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.881949 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.890367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rfx4t" event={"ID":"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33","Type":"ContainerStarted","Data":"f7308c011bdd4f72c35f1891c28eb53cd6df20e3968172b5a6f965cb399d856b"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.894353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:28 crc kubenswrapper[4794]: E1215 13:56:28.894971 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.394952214 +0000 UTC m=+151.246974652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.900108 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" event={"ID":"2c370085-8ba2-4be8-9290-ff2f37189fae","Type":"ContainerStarted","Data":"187f63376823bb70c9cd3b48a30ce37013e793927b81360a20f6bad61b811d67"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.900161 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" event={"ID":"2c370085-8ba2-4be8-9290-ff2f37189fae","Type":"ContainerStarted","Data":"117c0c461261263636148cc938ca89d4db6e91fe0aee6526e6778535ded4ceba"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.900919 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.910147 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" event={"ID":"acf556c6-8a5a-4980-b07d-28939b2246ee","Type":"ContainerStarted","Data":"7641510fa13b770dd127839209ea901ce962dfb5b9bdf45180d2cb1d528ca21d"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.921417 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" event={"ID":"057601fe-03b1-48d4-8bbc-4482f393d6cb","Type":"ContainerStarted","Data":"9e632ccd1ec00779fa3e050fe5e21cfbb64032ca3628c8be0c53bf9fb016d150"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.939526 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wd9tc" event={"ID":"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7","Type":"ContainerStarted","Data":"5916dc22fa529b535c79794cffd976bd35c468496321b754fb0cddf154241119"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.939595 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wd9tc" event={"ID":"06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7","Type":"ContainerStarted","Data":"dd97fa6103546ab971b5619a05211929357efbea53ed46ab9e3ccbdb9f95e087"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.940374 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qmz6h" podStartSLOduration=129.940353744 podStartE2EDuration="2m9.940353744s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:28.881316028 +0000 UTC m=+150.733338466" watchObservedRunningTime="2025-12-15 13:56:28.940353744 +0000 UTC m=+150.792376182" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.941319 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns-operator/dns-operator-744455d44c-824w4" podStartSLOduration=129.94131115 podStartE2EDuration="2m9.94131115s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:28.931947268 +0000 UTC m=+150.783969706" watchObservedRunningTime="2025-12-15 13:56:28.94131115 +0000 UTC m=+150.793333598" Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.943022 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln"] Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.951894 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" event={"ID":"29c2c126-2c52-4332-b622-a73f6b7f09c1","Type":"ContainerStarted","Data":"efc5826a7c214e6f2a4b081c0c0f80e935a2d6af2b3a5d7cd8461c32e86adf27"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.951958 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" event={"ID":"29c2c126-2c52-4332-b622-a73f6b7f09c1","Type":"ContainerStarted","Data":"f7604d41594a292fb3b019f04d1253472c8be2ea18f526730cec1a7c09b99d98"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.965181 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jpcm8" event={"ID":"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90","Type":"ContainerStarted","Data":"6d5b034a2db25f4b53a51bf77c22fd3f94c796206d0857c5e692bfdf6876d84b"} Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.980297 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-shzgd"] Dec 15 13:56:28 crc kubenswrapper[4794]: I1215 13:56:28.997511 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.000066 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.500055178 +0000 UTC m=+151.352077636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.034127 4794 patch_prober.go:28] interesting pod/console-operator-58897d9998-pdnn8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.034190 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" podUID="2c370085-8ba2-4be8-9290-ff2f37189fae" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.098861 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.100437 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j6274" podStartSLOduration=130.100426265 podStartE2EDuration="2m10.100426265s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:29.098035601 +0000 UTC m=+150.950058059" watchObservedRunningTime="2025-12-15 13:56:29.100426265 +0000 UTC m=+150.952448703" Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.100541 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.600528148 +0000 UTC m=+151.452550586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.201468 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.203450 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.703437432 +0000 UTC m=+151.555459870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.315274 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8fxqs" podStartSLOduration=132.315257557 podStartE2EDuration="2m12.315257557s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:29.314181428 +0000 UTC m=+151.166203866" watchObservedRunningTime="2025-12-15 13:56:29.315257557 +0000 UTC m=+151.167279995" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.315547 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" podStartSLOduration=130.315542024 podStartE2EDuration="2m10.315542024s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:29.286817282 +0000 UTC m=+151.138839720" watchObservedRunningTime="2025-12-15 13:56:29.315542024 +0000 UTC m=+151.167564462" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.334645 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.334947 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.834933445 +0000 UTC m=+151.686955883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.366501 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-v76sx" podStartSLOduration=132.366487073 podStartE2EDuration="2m12.366487073s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:29.365917488 +0000 UTC m=+151.217939926" watchObservedRunningTime="2025-12-15 13:56:29.366487073 +0000 UTC m=+151.218509511" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.435695 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" 
Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.436204 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:29.936190806 +0000 UTC m=+151.788213244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.522544 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hsk8w" podStartSLOduration=132.522530665 podStartE2EDuration="2m12.522530665s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:29.476061407 +0000 UTC m=+151.328083845" watchObservedRunningTime="2025-12-15 13:56:29.522530665 +0000 UTC m=+151.374553103" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.542035 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.542449 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.04243075 +0000 UTC m=+151.894453188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.566308 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" podStartSLOduration=130.56627576099999 podStartE2EDuration="2m10.566275761s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:29.560943807 +0000 UTC m=+151.412966245" watchObservedRunningTime="2025-12-15 13:56:29.566275761 +0000 UTC m=+151.418298199" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.658377 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.658705 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.158695824 +0000 UTC m=+152.010718262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.673047 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q"] Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.682567 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz"] Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.719539 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.728220 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.728267 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: 
connect: connection refused" Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.736969 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pdzrx"] Dec 15 13:56:29 crc kubenswrapper[4794]: W1215 13:56:29.749644 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49c74c66_25d9_4f7c_a9d0_8d17b8387aa8.slice/crio-ec7ed04a6383b6baa271d1ddff9bb762daa0f67ee11f9093ecb24023d227d66b WatchSource:0}: Error finding container ec7ed04a6383b6baa271d1ddff9bb762daa0f67ee11f9093ecb24023d227d66b: Status 404 returned error can't find the container with id ec7ed04a6383b6baa271d1ddff9bb762daa0f67ee11f9093ecb24023d227d66b Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.749692 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5"] Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.758884 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.759161 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.259147162 +0000 UTC m=+152.111169600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.759182 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh"] Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.791975 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vvw2l"] Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.860357 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.861959 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.361940254 +0000 UTC m=+152.213962682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.933775 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"] Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.970642 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:29 crc kubenswrapper[4794]: E1215 13:56:29.970979 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.470964363 +0000 UTC m=+152.322986801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:29 crc kubenswrapper[4794]: I1215 13:56:29.988463 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.005636 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.034621 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" podStartSLOduration=131.034601483 podStartE2EDuration="2m11.034601483s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.020373451 +0000 UTC m=+151.872395889" watchObservedRunningTime="2025-12-15 13:56:30.034601483 +0000 UTC m=+151.886623921" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.046603 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kpl54"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.053241 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.058767 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.078801 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.079144 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.579130929 +0000 UTC m=+152.431153357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: W1215 13:56:30.085725 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f186724_4abc_41b9_9e11_c40588de02ea.slice/crio-6206748fe37dd3c574a4694db3b4780eb0343118c4504198fed23e9c762a2d86 WatchSource:0}: Error finding container 6206748fe37dd3c574a4694db3b4780eb0343118c4504198fed23e9c762a2d86: Status 404 returned error can't find the container with id 6206748fe37dd3c574a4694db3b4780eb0343118c4504198fed23e9c762a2d86 Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.092863 4794 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" podStartSLOduration=132.092839418 podStartE2EDuration="2m12.092839418s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.066319875 +0000 UTC m=+151.918342313" watchObservedRunningTime="2025-12-15 13:56:30.092839418 +0000 UTC m=+151.944861876" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.115786 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n7hgc"] Dec 15 13:56:30 crc kubenswrapper[4794]: W1215 13:56:30.125017 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e098c00_1b7c_4f84_bb6e_c2769a4f7a55.slice/crio-d97c2de1ea7556c2266c04f3eeb1e2073cff2a9b9ec775c9ea397766c2d405ec WatchSource:0}: Error finding container d97c2de1ea7556c2266c04f3eeb1e2073cff2a9b9ec775c9ea397766c2d405ec: Status 404 returned error can't find the container with id d97c2de1ea7556c2266c04f3eeb1e2073cff2a9b9ec775c9ea397766c2d405ec Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.127266 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.142038 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jpcm8" event={"ID":"77efa6c3-7ad2-40d0-83f6-17db5e5c5c90","Type":"ContainerStarted","Data":"6687c619df35f5920b9dec87ac111dc26c52e5602ed88964b9a2aa93cec6ec88"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.145894 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wd9tc" podStartSLOduration=131.145874192 
podStartE2EDuration="2m11.145874192s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.142183273 +0000 UTC m=+151.994205711" watchObservedRunningTime="2025-12-15 13:56:30.145874192 +0000 UTC m=+151.997896630" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.166452 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rfx4t" event={"ID":"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33","Type":"ContainerStarted","Data":"575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.174326 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.185277 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.185632 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.68560689 +0000 UTC m=+152.537629398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.187097 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" event={"ID":"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8","Type":"ContainerStarted","Data":"ec7ed04a6383b6baa271d1ddff9bb762daa0f67ee11f9093ecb24023d227d66b"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.201643 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qmc9"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.214344 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" podStartSLOduration=132.214323101 podStartE2EDuration="2m12.214323101s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.209231835 +0000 UTC m=+152.061254283" watchObservedRunningTime="2025-12-15 13:56:30.214323101 +0000 UTC m=+152.066345539" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.214575 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5m27d"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.245548 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q"] Dec 15 13:56:30 crc 
kubenswrapper[4794]: I1215 13:56:30.246313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" event={"ID":"acf556c6-8a5a-4980-b07d-28939b2246ee","Type":"ContainerStarted","Data":"a383e1997c9c1b866fa97ddbb48fc6f63acdd248402e51ccf9f1dee604f0f415"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.270365 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dwkxc" podStartSLOduration=131.270345967 podStartE2EDuration="2m11.270345967s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.262065304 +0000 UTC m=+152.114087752" watchObservedRunningTime="2025-12-15 13:56:30.270345967 +0000 UTC m=+152.122368405" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.287476 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.291908 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.791890716 +0000 UTC m=+152.643913154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.304062 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" event={"ID":"2d01d215-bf1a-4d62-8584-d15315cb4675","Type":"ContainerStarted","Data":"06cb7bdca4e0aba0b5a0954f7b7bc39c1c8086f1d812011f142a4d25ccf86ce3"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.304181 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" event={"ID":"2d01d215-bf1a-4d62-8584-d15315cb4675","Type":"ContainerStarted","Data":"e4feef2b6e5a830bb0b3fdecc038865b1bdaf466d2fdd08a180b315d48fef67a"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.326818 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l" event={"ID":"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1","Type":"ContainerStarted","Data":"6f123438b1b3ff0bce0eecb86fc77fd45bf0ffa2ad16df23e4c330266dcf3d49"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.328902 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvkw5"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.330308 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rfx4t" podStartSLOduration=132.330293277 podStartE2EDuration="2m12.330293277s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.322924909 +0000 UTC m=+152.174947347" watchObservedRunningTime="2025-12-15 13:56:30.330293277 +0000 UTC m=+152.182315715" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.356810 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" event={"ID":"9777bea8-2829-45ff-85c1-69f25ff7f5ce","Type":"ContainerStarted","Data":"897f10acc8e72c2019abd6cbda787a430177dfdfe3cb7016b2df7ff72c94b8bb"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.356856 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh" event={"ID":"15605bda-e3da-46db-8c0d-9ebfadab8bbb","Type":"ContainerStarted","Data":"fd5fe94eda2504541d62a835f1f10b86a6bb853fcf547eb713abd8b935ede4fa"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.356868 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" event={"ID":"da9394b5-19ce-4482-b107-9339d9813a25","Type":"ContainerStarted","Data":"a5c71ffd93af086dc7ba8566d0ae9a7b78259c84ceca829225250d81426ce1ff"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.371647 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-59wlr"] Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.384848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pdzrx" event={"ID":"8900322d-6a30-4b78-807c-ef710a33d219","Type":"ContainerStarted","Data":"78cf357fa35446c31650b020dce8a26570806db3a10d57ed6b7bb2251917ae40"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.389407 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.389677 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.889644792 +0000 UTC m=+152.741667230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.389706 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-glkfs" podStartSLOduration=131.389694533 podStartE2EDuration="2m11.389694533s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.368986907 +0000 UTC m=+152.221009365" watchObservedRunningTime="2025-12-15 13:56:30.389694533 +0000 UTC m=+152.241716971" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.389970 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.391839 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.89182674 +0000 UTC m=+152.743849258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.410305 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" podStartSLOduration=133.410287546 podStartE2EDuration="2m13.410287546s" podCreationTimestamp="2025-12-15 13:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.408099658 +0000 UTC m=+152.260122086" watchObservedRunningTime="2025-12-15 13:56:30.410287546 +0000 UTC m=+152.262309984" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.420820 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" event={"ID":"c8d46051-64c9-4b63-b3de-c50f8943a37c","Type":"ContainerStarted","Data":"3a112de60dd2eb12c8a982348d7fabc6cce838bb99def262e918489bfe4981fc"} Dec 15 13:56:30 crc 
kubenswrapper[4794]: I1215 13:56:30.420862 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" event={"ID":"c8d46051-64c9-4b63-b3de-c50f8943a37c","Type":"ContainerStarted","Data":"06d2505921161295a7e859f8b40496a32266b15dffe23dba8bbf28a13297174c"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.424706 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" event={"ID":"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4","Type":"ContainerStarted","Data":"ead57cdb8e095669dd52a349c611e913e1b1bd7e45b7dcb369dcd0fe3b206a3c"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.425945 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" event={"ID":"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4","Type":"ContainerStarted","Data":"702137cad3b2e0101ebb8a95ff970d83f27bf9c19d3f8db3b20d41afe560e21f"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.425971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" event={"ID":"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4","Type":"ContainerStarted","Data":"573299ae360944ecff9e716089027a2785e4d2e624d24f45bced2b8c984225ae"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.425980 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" event={"ID":"68a2b8f6-4ed6-4592-a4c1-7ba29357ffc4","Type":"ContainerStarted","Data":"4f4a31e4ca3e37f606b2ca2c2da3096f2dd15a7c47d40893765944ae54f987b8"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.447871 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jpcm8" podStartSLOduration=6.447848976 
podStartE2EDuration="6.447848976s" podCreationTimestamp="2025-12-15 13:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.439541122 +0000 UTC m=+152.291563560" watchObservedRunningTime="2025-12-15 13:56:30.447848976 +0000 UTC m=+152.299871414" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.474705 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" event={"ID":"08faaabe-18ef-488b-b8d9-d4709136bcd6","Type":"ContainerStarted","Data":"89607cd1cef0f85bcae21e168b8b51ba94a1cf74e53479e21e8b018e40adc3bb"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.474737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" event={"ID":"08faaabe-18ef-488b-b8d9-d4709136bcd6","Type":"ContainerStarted","Data":"3330aa61c9b29121494303f9cdf7cf0d2038b1e9a5e07aa91943e8bdf928c766"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.474746 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" event={"ID":"08faaabe-18ef-488b-b8d9-d4709136bcd6","Type":"ContainerStarted","Data":"1489754a46bc20bbac511f5e1420e9a0c5985bdfcdf0d779f9ed521afddd2fc2"} Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.492138 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.493152 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:30.993129102 +0000 UTC m=+152.845151540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.543117 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bpkkk" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.596004 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.604983 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.104969517 +0000 UTC m=+152.956991955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.611906 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhcln" podStartSLOduration=132.611888503 podStartE2EDuration="2m12.611888503s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.594320631 +0000 UTC m=+152.446343079" watchObservedRunningTime="2025-12-15 13:56:30.611888503 +0000 UTC m=+152.463910961" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.696798 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.697405 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.1973897 +0000 UTC m=+153.049412138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.725649 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:30 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:30 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:30 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.725698 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.799255 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.802200 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-15 13:56:31.302170105 +0000 UTC m=+153.154192543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.859799 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pdnn8" Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.900202 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.900397 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.400375423 +0000 UTC m=+153.252397861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.900683 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:30 crc kubenswrapper[4794]: E1215 13:56:30.900970 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.400962899 +0000 UTC m=+153.252985337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:30 crc kubenswrapper[4794]: I1215 13:56:30.988688 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-shzgd" podStartSLOduration=131.988669906 podStartE2EDuration="2m11.988669906s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:30.986698463 +0000 UTC m=+152.838720901" watchObservedRunningTime="2025-12-15 13:56:30.988669906 +0000 UTC m=+152.840692344" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.002242 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.002661 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.502643081 +0000 UTC m=+153.354665519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.103755 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.104073 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.604055936 +0000 UTC m=+153.456078374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.205850 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.206214 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.706188549 +0000 UTC m=+153.558210987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.206263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.206801 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.706792535 +0000 UTC m=+153.558814973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.308014 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.308358 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.808340813 +0000 UTC m=+153.660363251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.309846 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rx78x" podStartSLOduration=132.309826853 podStartE2EDuration="2m12.309826853s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.308400525 +0000 UTC m=+153.160422973" watchObservedRunningTime="2025-12-15 13:56:31.309826853 +0000 UTC m=+153.161849291" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.317868 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.318224 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.345358 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.375680 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.377844 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.409480 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.409864 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:31.90985064 +0000 UTC m=+153.761873078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.494888 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" event={"ID":"49c74c66-25d9-4f7c-a9d0-8d17b8387aa8","Type":"ContainerStarted","Data":"23a48da563085a585c6a97d670911387bb01c04cb0dba0f182293de6a5d23656"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.497372 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5m27d" 
event={"ID":"6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d","Type":"ContainerStarted","Data":"7de9716ea515c5b09dab1ba16f1f856f85a28fb945d51ba34d2e6ca5b5fd8d51"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.497399 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5m27d" event={"ID":"6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d","Type":"ContainerStarted","Data":"1db79320e6a982b3aae88d875adc4c7f0ccb3e561904a783bd3882c98d4820af"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.504667 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" event={"ID":"3f106ed7-7479-46b8-ab6f-e6b291078caa","Type":"ContainerStarted","Data":"4ddf5cfcafffb77f0c816ff299f6cd1e2c95480c6fc93183d9fd02b367a9c42f"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.511073 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.511500 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.011485581 +0000 UTC m=+153.863508019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.518658 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-86n6q" podStartSLOduration=132.518637413 podStartE2EDuration="2m12.518637413s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.515831928 +0000 UTC m=+153.367854366" watchObservedRunningTime="2025-12-15 13:56:31.518637413 +0000 UTC m=+153.370659851" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.548011 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" event={"ID":"d502ead2-f4b7-4a34-bd18-6fc872cb30c2","Type":"ContainerStarted","Data":"dc7b550f9c643de1cebbec0fa9998711c97b850da26f83ea037382808c6d9c1b"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.548056 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" event={"ID":"d502ead2-f4b7-4a34-bd18-6fc872cb30c2","Type":"ContainerStarted","Data":"830a09b3f081b076260ed9828989055141e1dd3df85775b2250e975c65400e24"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.548580 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.555432 
4794 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tvkw5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.555484 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.556688 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" event={"ID":"cf38d336-cbd8-4f99-9aa0-4a826979456c","Type":"ContainerStarted","Data":"00ea0088dba76d2148b70c4aa2149447b0704ee5c359c6f8e4994ca283e59b75"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.557884 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" event={"ID":"d9a6a61b-d9c0-4e9f-b72e-bb8f17112bb4","Type":"ContainerStarted","Data":"4541d7846f2b4bcb0c42178d528cd6325c1c62a30c912c1c7ec5af9ea31cfbd5"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.567102 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" podStartSLOduration=132.567085765 podStartE2EDuration="2m12.567085765s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.56468582 +0000 UTC m=+153.416708258" watchObservedRunningTime="2025-12-15 13:56:31.567085765 +0000 UTC 
m=+153.419108203" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.583572 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rfqgz" podStartSLOduration=132.583552937 podStartE2EDuration="2m12.583552937s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.583448904 +0000 UTC m=+153.435471342" watchObservedRunningTime="2025-12-15 13:56:31.583552937 +0000 UTC m=+153.435575375" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.612146 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.612914 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.112894505 +0000 UTC m=+153.964916943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.628117 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" event={"ID":"2f186724-4abc-41b9-9e11-c40588de02ea","Type":"ContainerStarted","Data":"80c2379e9211f911d53f429ccbac292f97486396f3ceb5e838dc91c9a54abe2a"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.628172 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" event={"ID":"2f186724-4abc-41b9-9e11-c40588de02ea","Type":"ContainerStarted","Data":"6206748fe37dd3c574a4694db3b4780eb0343118c4504198fed23e9c762a2d86"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.649532 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" event={"ID":"fb1b4e69-9921-4aef-b791-5408f51c4f71","Type":"ContainerStarted","Data":"57246bdbb7dc4e594848177ef5e90a5b2cc1116388af15eb022b75a43d0b0212"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.649876 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.649935 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cxtdf" podStartSLOduration=132.64991728 podStartE2EDuration="2m12.64991728s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.649438737 +0000 UTC m=+153.501461185" watchObservedRunningTime="2025-12-15 13:56:31.64991728 +0000 UTC m=+153.501939718" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.656533 4794 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-682gk container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.656575 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" podUID="fb1b4e69-9921-4aef-b791-5408f51c4f71" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.663341 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh" event={"ID":"15605bda-e3da-46db-8c0d-9ebfadab8bbb","Type":"ContainerStarted","Data":"ee5a9666cca898dfeaae74b658fe6ac8a83fac07033321a28bd1039d49d3398a"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.708831 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kpl54" event={"ID":"39567a2d-ea06-4b63-ac4f-9615807a599c","Type":"ContainerStarted","Data":"ffb12750f82b52dd32c7e1035b92867c14971a4a435cea6a2dc334f53007ec3f"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.708894 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kpl54" event={"ID":"39567a2d-ea06-4b63-ac4f-9615807a599c","Type":"ContainerStarted","Data":"152d0d376d6b231fdde7e0a6ab8fb500928bf60f518fb1d8371f65805dc8c7fb"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.713316 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.714528 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.214509416 +0000 UTC m=+154.066531864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.718433 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" podStartSLOduration=132.718417151 podStartE2EDuration="2m12.718417151s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.686446602 +0000 UTC m=+153.538469040" watchObservedRunningTime="2025-12-15 13:56:31.718417151 +0000 UTC m=+153.570439599" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.725847 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l" event={"ID":"abbd31bb-b5cd-481f-b3c2-cfb3586c0ff1","Type":"ContainerStarted","Data":"b0a7dcca98b58a7dff6073a01619c384718f4abea2a8e7b617cb993e367090be"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.737763 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:31 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:31 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:31 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.737806 4794 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.738109 4794 patch_prober.go:28] interesting pod/apiserver-76f77b778f-69q8p container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]log ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]etcd ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/generic-apiserver-start-informers ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/max-in-flight-filter ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 15 13:56:31 crc kubenswrapper[4794]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 15 13:56:31 crc kubenswrapper[4794]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/project.openshift.io-projectcache ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/openshift.io-startinformers ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 15 13:56:31 crc kubenswrapper[4794]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 15 13:56:31 crc kubenswrapper[4794]: livez check failed Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.738172 4794 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" podUID="acf556c6-8a5a-4980-b07d-28939b2246ee" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.751491 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" event={"ID":"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55","Type":"ContainerStarted","Data":"fa17fdece978efa58e7bb0e04ccb852129b0d1874328b3b52e042d34102e03db"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.751542 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" event={"ID":"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55","Type":"ContainerStarted","Data":"d97c2de1ea7556c2266c04f3eeb1e2073cff2a9b9ec775c9ea397766c2d405ec"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.770035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pdzrx" event={"ID":"8900322d-6a30-4b78-807c-ef710a33d219","Type":"ContainerStarted","Data":"225a9a97f874200af6070bce497f243dbea9bd7ae0c1742fc5c782a48f74b2b3"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.785685 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vvw2l" podStartSLOduration=132.785666937 podStartE2EDuration="2m12.785666937s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.785198415 +0000 UTC m=+153.637220853" watchObservedRunningTime="2025-12-15 13:56:31.785666937 +0000 UTC m=+153.637689375" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.787503 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lvgzh" podStartSLOduration=132.787497196 podStartE2EDuration="2m12.787497196s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.724899075 +0000 UTC m=+153.576921513" watchObservedRunningTime="2025-12-15 13:56:31.787497196 +0000 UTC m=+153.639519634" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.821000 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.822467 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.322454696 +0000 UTC m=+154.174477124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.826841 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" event={"ID":"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e","Type":"ContainerStarted","Data":"abece55c9a749021cdc779bd4db421d6204b783e66d61e0012d9ef3408c1b904"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.826876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" event={"ID":"f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e","Type":"ContainerStarted","Data":"27878242b396a50bffd2811df2bd98c1060bbcb4a408230a732e5171f62152cb"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.827539 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.864206 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" event={"ID":"da9394b5-19ce-4482-b107-9339d9813a25","Type":"ContainerStarted","Data":"122c37339c404c5df3288e8575a3bfb34a8ce77160701718e71e9a9d6f2177bc"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.889623 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" 
event={"ID":"03a11d34-91da-442e-a26c-a7a5453374ab","Type":"ContainerStarted","Data":"7de4b897d1c5bb8f0a8b97862d771342abf52e95e7e1ecf7edc19977b6546ce7"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.889668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" event={"ID":"03a11d34-91da-442e-a26c-a7a5453374ab","Type":"ContainerStarted","Data":"9cab24e88fd04fce89fd98433864ecccdf3fa93b17d5af199ae1d831ef0b7c21"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.890396 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.895092 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pdzrx" podStartSLOduration=7.895078487 podStartE2EDuration="7.895078487s" podCreationTimestamp="2025-12-15 13:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.842934236 +0000 UTC m=+153.694956684" watchObservedRunningTime="2025-12-15 13:56:31.895078487 +0000 UTC m=+153.747100925" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.900629 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" podStartSLOduration=132.900605055 podStartE2EDuration="2m12.900605055s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.894848931 +0000 UTC m=+153.746871379" watchObservedRunningTime="2025-12-15 13:56:31.900605055 +0000 UTC m=+153.752627493" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.932145 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.932760 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:31 crc kubenswrapper[4794]: E1215 13:56:31.933710 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.433690964 +0000 UTC m=+154.285713402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.936660 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" event={"ID":"308314c2-75ff-4d16-9c99-15574bb6a3d8","Type":"ContainerStarted","Data":"7273db05492d9e7d3ed5bf52461cc692074469f3fdb5cb17b819fd1ce27797fd"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.945057 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bbg49" podStartSLOduration=132.945037679 podStartE2EDuration="2m12.945037679s" podCreationTimestamp="2025-12-15 13:54:19 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.936380827 +0000 UTC m=+153.788403265" watchObservedRunningTime="2025-12-15 13:56:31.945037679 +0000 UTC m=+153.797060117" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.962243 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" event={"ID":"854fdfac-41a5-473c-b955-95b2ecae9856","Type":"ContainerStarted","Data":"f6c792a6ac450704ffc565fae9cb631224020ebfe44b85aa8597e35e0ecaa0b5"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.962639 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" podStartSLOduration=133.962621992 podStartE2EDuration="2m13.962621992s" podCreationTimestamp="2025-12-15 13:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:31.962475338 +0000 UTC m=+153.814497786" watchObservedRunningTime="2025-12-15 13:56:31.962621992 +0000 UTC m=+153.814644430" Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.979707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr" event={"ID":"483b4762-71e8-4fb2-9d6a-2daa57a596df","Type":"ContainerStarted","Data":"b3b19ed6d9fc6ab1981480a8efaa2fb0ee79a9ee3cfa83fa13b12689b2af72a4"} Dec 15 13:56:31 crc kubenswrapper[4794]: I1215 13:56:31.979766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr" event={"ID":"483b4762-71e8-4fb2-9d6a-2daa57a596df","Type":"ContainerStarted","Data":"43764c2b4b134909691c75093f308526e7d2c77558e9478ddd49a24fdbb4443a"} Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.015889 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" event={"ID":"9777bea8-2829-45ff-85c1-69f25ff7f5ce","Type":"ContainerStarted","Data":"160bc6e2c0ab895005fa9ed79ad5dfea80ca40e7e65fc9c4287b44c3af0cc371"} Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.015952 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" event={"ID":"9777bea8-2829-45ff-85c1-69f25ff7f5ce","Type":"ContainerStarted","Data":"a95c1f8474a4af7f99629eea0caba8d481ac67895b4cac50bad2f2a27d34ce30"} Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.030846 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnkmj" Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.043338 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.044443 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.544426009 +0000 UTC m=+154.396448447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.046254 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-59wlr" podStartSLOduration=133.046238938 podStartE2EDuration="2m13.046238938s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:32.044535142 +0000 UTC m=+153.896557580" watchObservedRunningTime="2025-12-15 13:56:32.046238938 +0000 UTC m=+153.898261386" Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.144933 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.147131 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.647109198 +0000 UTC m=+154.499131636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.246738 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.247340 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.747324541 +0000 UTC m=+154.599346989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.348846 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.349077 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.849034053 +0000 UTC m=+154.701056491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.349118 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.349610 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.849599018 +0000 UTC m=+154.701621456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.450283 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.450792 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:32.950760926 +0000 UTC m=+154.802783374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.551901 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.552321 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.052302524 +0000 UTC m=+154.904325002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.653052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.653269 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.153247816 +0000 UTC m=+155.005270254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.653611 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.653934 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.153925595 +0000 UTC m=+155.005948033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.717191 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:32 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:32 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:32 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.717247 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.754085 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.754482 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-15 13:56:33.254467306 +0000 UTC m=+155.106489744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.827652 4794 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nbcfc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.827720 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" podUID="f2cd6d49-cd12-4c9c-b39f-ad25cd083b3e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.855742 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.856098 4794 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.356069565 +0000 UTC m=+155.208092003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.956949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.957112 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.457077759 +0000 UTC m=+155.309100207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:32 crc kubenswrapper[4794]: I1215 13:56:32.957247 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:32 crc kubenswrapper[4794]: E1215 13:56:32.957526 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.457512751 +0000 UTC m=+155.309535189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.021312 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" event={"ID":"fb1b4e69-9921-4aef-b791-5408f51c4f71","Type":"ContainerStarted","Data":"498f67c976db98e4de4c4a913878d42b6e67903fdf918ea30710f39d3c77219a"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.023865 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" event={"ID":"cf38d336-cbd8-4f99-9aa0-4a826979456c","Type":"ContainerStarted","Data":"dcb038d8b25c1e0b8971d55aa015452bf9f1bbcea4ae2a4b0d250fb6d10fa5f8"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.023906 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" event={"ID":"cf38d336-cbd8-4f99-9aa0-4a826979456c","Type":"ContainerStarted","Data":"634200682a93e400b415e34eed953af19af294cb2a856c64308d02fb30f7fba9"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.025047 4794 generic.go:334] "Generic (PLEG): container finished" podID="da9394b5-19ce-4482-b107-9339d9813a25" containerID="122c37339c404c5df3288e8575a3bfb34a8ce77160701718e71e9a9d6f2177bc" exitCode=0 Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.025099 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" 
event={"ID":"da9394b5-19ce-4482-b107-9339d9813a25","Type":"ContainerDied","Data":"122c37339c404c5df3288e8575a3bfb34a8ce77160701718e71e9a9d6f2177bc"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.026219 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-682gk" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.026588 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" event={"ID":"308314c2-75ff-4d16-9c99-15574bb6a3d8","Type":"ContainerStarted","Data":"52236e57a864d362a93ed264e2cd54419de48c4271af942237d03fae06b71501"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.026612 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" event={"ID":"308314c2-75ff-4d16-9c99-15574bb6a3d8","Type":"ContainerStarted","Data":"b46b08c79ff4b91772c11eda51ee38a43e215a044698e90a88835c794c8bb401"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.028595 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kpl54" event={"ID":"39567a2d-ea06-4b63-ac4f-9615807a599c","Type":"ContainerStarted","Data":"7d41e685c865ccca885da78fa77950065abd1bf763e945aa9f58d087976a9910"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.028707 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.030090 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" event={"ID":"3f106ed7-7479-46b8-ab6f-e6b291078caa","Type":"ContainerStarted","Data":"48acb8f078d377b613cec1ede6179e654bdf8473633e36819f1d757b464620cb"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.031468 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" event={"ID":"854fdfac-41a5-473c-b955-95b2ecae9856","Type":"ContainerStarted","Data":"758fa052b035019cc8af3973c6bc9702161fcfd090587112f09eef91afdd6bb4"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.031494 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" event={"ID":"854fdfac-41a5-473c-b955-95b2ecae9856","Type":"ContainerStarted","Data":"f2aa3707831a9c062ec4f6cd7ba0fae53307d0847066b1a6fc0c17e0187a0569"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.031893 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.033700 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" event={"ID":"6e098c00-1b7c-4f84-bb6e-c2769a4f7a55","Type":"ContainerStarted","Data":"491ad81dbde6f9624023a9147ff28af84fe671178926d030f027bdfbcfa06625"} Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.034837 4794 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tvkw5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.034868 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.038001 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mvg8j" podStartSLOduration=134.037973013 podStartE2EDuration="2m14.037973013s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:33.036561275 +0000 UTC m=+154.888583713" watchObservedRunningTime="2025-12-15 13:56:33.037973013 +0000 UTC m=+154.889995451" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.040416 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8pfs5" podStartSLOduration=134.040410678 podStartE2EDuration="2m14.040410678s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:32.123906575 +0000 UTC m=+153.975929033" watchObservedRunningTime="2025-12-15 13:56:33.040410678 +0000 UTC m=+154.892433116" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.059229 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.059321 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.559301296 +0000 UTC m=+155.411323744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.068407 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.068982 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.568960145 +0000 UTC m=+155.420982583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.106572 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qmc9" podStartSLOduration=134.106556495 podStartE2EDuration="2m14.106556495s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:33.078727788 +0000 UTC m=+154.930750246" watchObservedRunningTime="2025-12-15 13:56:33.106556495 +0000 UTC m=+154.958578933" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.107969 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5m27d" podStartSLOduration=134.107964483 podStartE2EDuration="2m14.107964483s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:33.10561956 +0000 UTC m=+154.957641998" watchObservedRunningTime="2025-12-15 13:56:33.107964483 +0000 UTC m=+154.959986921" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.169602 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.169780 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.669751913 +0000 UTC m=+155.521774351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.169937 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.170239 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.670232376 +0000 UTC m=+155.522254814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.174655 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" podStartSLOduration=134.174643555 podStartE2EDuration="2m14.174643555s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:33.164029189 +0000 UTC m=+155.016051627" watchObservedRunningTime="2025-12-15 13:56:33.174643555 +0000 UTC m=+155.026665993" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.184845 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4vc5z" podStartSLOduration=134.184826498 podStartE2EDuration="2m14.184826498s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:33.184131499 +0000 UTC m=+155.036153937" watchObservedRunningTime="2025-12-15 13:56:33.184826498 +0000 UTC m=+155.036848936" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.199045 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kpl54" podStartSLOduration=9.19903065 podStartE2EDuration="9.19903065s" podCreationTimestamp="2025-12-15 13:56:24 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:33.197041916 +0000 UTC m=+155.049064354" watchObservedRunningTime="2025-12-15 13:56:33.19903065 +0000 UTC m=+155.051053088" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.271250 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.271441 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.771415045 +0000 UTC m=+155.623437483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.319066 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nbcfc" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.372928 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.373233 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.87322253 +0000 UTC m=+155.725244958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.474293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.474481 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.97445624 +0000 UTC m=+155.826478678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.474610 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.474930 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:33.974921712 +0000 UTC m=+155.826944150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.575912 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.576070 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.076046479 +0000 UTC m=+155.928068917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.576150 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.576483 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.0764731 +0000 UTC m=+155.928495608 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.678068 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.678223 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.178201713 +0000 UTC m=+156.030224161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.678335 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.678685 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.178675686 +0000 UTC m=+156.030698124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.717519 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:33 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:33 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:33 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.717901 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.779359 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.779520 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-15 13:56:34.279494745 +0000 UTC m=+156.131517193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.779783 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.780158 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.280141512 +0000 UTC m=+156.132163950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.880466 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.880613 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.38056927 +0000 UTC m=+156.232591708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.880800 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.881119 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.381104475 +0000 UTC m=+156.233126903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.982029 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.982232 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.482208301 +0000 UTC m=+156.334230739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:33 crc kubenswrapper[4794]: I1215 13:56:33.982490 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:33 crc kubenswrapper[4794]: E1215 13:56:33.982872 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.482855409 +0000 UTC m=+156.334877847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.040017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" event={"ID":"3f106ed7-7479-46b8-ab6f-e6b291078caa","Type":"ContainerStarted","Data":"a909f868ca6ac7ea075fed8ef6bb8f086615363b5b16ad1a8f1f628a817fcb39"} Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.049486 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.083422 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.084329 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.584312874 +0000 UTC m=+156.436335312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.172717 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hj6qt"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.175974 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.179163 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.184701 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.185285 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.685270637 +0000 UTC m=+156.537293075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.185881 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hj6qt"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.257440 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.285992 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da9394b5-19ce-4482-b107-9339d9813a25-config-volume\") pod \"da9394b5-19ce-4482-b107-9339d9813a25\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286042 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da9394b5-19ce-4482-b107-9339d9813a25-secret-volume\") pod \"da9394b5-19ce-4482-b107-9339d9813a25\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286091 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfqpw\" (UniqueName: \"kubernetes.io/projected/da9394b5-19ce-4482-b107-9339d9813a25-kube-api-access-mfqpw\") pod \"da9394b5-19ce-4482-b107-9339d9813a25\" (UID: \"da9394b5-19ce-4482-b107-9339d9813a25\") " Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286260 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.286343 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.786322492 +0000 UTC m=+156.638344930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286386 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-catalog-content\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286428 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286447 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnlzz\" (UniqueName: \"kubernetes.io/projected/e64ae040-f93d-4dcc-8dce-812b127b0630-kube-api-access-pnlzz\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286456 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9394b5-19ce-4482-b107-9339d9813a25-config-volume" (OuterVolumeSpecName: "config-volume") pod "da9394b5-19ce-4482-b107-9339d9813a25" (UID: "da9394b5-19ce-4482-b107-9339d9813a25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.286477 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-utilities\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.286713 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.786705182 +0000 UTC m=+156.638727620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.301651 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9394b5-19ce-4482-b107-9339d9813a25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da9394b5-19ce-4482-b107-9339d9813a25" (UID: "da9394b5-19ce-4482-b107-9339d9813a25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.305394 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9394b5-19ce-4482-b107-9339d9813a25-kube-api-access-mfqpw" (OuterVolumeSpecName: "kube-api-access-mfqpw") pod "da9394b5-19ce-4482-b107-9339d9813a25" (UID: "da9394b5-19ce-4482-b107-9339d9813a25"). InnerVolumeSpecName "kube-api-access-mfqpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.350169 4794 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.364918 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2lkl6"] Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.365266 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9394b5-19ce-4482-b107-9339d9813a25" containerName="collect-profiles" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.365293 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9394b5-19ce-4482-b107-9339d9813a25" containerName="collect-profiles" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.365462 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9394b5-19ce-4482-b107-9339d9813a25" containerName="collect-profiles" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.366655 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.370961 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lkl6"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.371414 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.387399 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.387640 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-utilities\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.387767 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.887733596 +0000 UTC m=+156.739756074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.387840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-catalog-content\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.387957 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8qm2\" (UniqueName: \"kubernetes.io/projected/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-kube-api-access-f8qm2\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-utilities\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388080 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnlzz\" (UniqueName: \"kubernetes.io/projected/e64ae040-f93d-4dcc-8dce-812b127b0630-kube-api-access-pnlzz\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388223 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-utilities\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388252 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-catalog-content\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388369 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-catalog-content\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388384 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfqpw\" (UniqueName: \"kubernetes.io/projected/da9394b5-19ce-4482-b107-9339d9813a25-kube-api-access-mfqpw\") on node 
\"crc\" DevicePath \"\"" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388451 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da9394b5-19ce-4482-b107-9339d9813a25-config-volume\") on node \"crc\" DevicePath \"\"" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.388467 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da9394b5-19ce-4482-b107-9339d9813a25-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.394650 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.894634552 +0000 UTC m=+156.746656990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.413558 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnlzz\" (UniqueName: \"kubernetes.io/projected/e64ae040-f93d-4dcc-8dce-812b127b0630-kube-api-access-pnlzz\") pod \"certified-operators-hj6qt\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.489658 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.489902 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.98987024 +0000 UTC m=+156.841892738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.490016 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8qm2\" (UniqueName: \"kubernetes.io/projected/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-kube-api-access-f8qm2\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.490040 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-utilities\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.490083 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.490105 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-catalog-content\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.490400 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:34.990386944 +0000 UTC m=+156.842409382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.490501 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-catalog-content\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.490597 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-utilities\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.497086 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.506423 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8qm2\" (UniqueName: \"kubernetes.io/projected/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-kube-api-access-f8qm2\") pod \"community-operators-2lkl6\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.573464 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xb6h5"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.574565 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.606015 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.606355 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-catalog-content\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.606428 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-utilities\") pod \"certified-operators-xb6h5\" (UID: 
\"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.606500 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-868n4\" (UniqueName: \"kubernetes.io/projected/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-kube-api-access-868n4\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.606717 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:35.106698849 +0000 UTC m=+156.958721287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.606760 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xb6h5"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.691885 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.709808 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.709914 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-catalog-content\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.710022 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-utilities\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.710144 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-868n4\" (UniqueName: \"kubernetes.io/projected/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-kube-api-access-868n4\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.710698 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-15 13:56:35.210686593 +0000 UTC m=+157.062709031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.710774 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-catalog-content\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.712235 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-utilities\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.719975 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:34 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:34 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:34 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.720117 4794 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.721981 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hj6qt"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.726830 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-868n4\" (UniqueName: \"kubernetes.io/projected/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-kube-api-access-868n4\") pod \"certified-operators-xb6h5\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: W1215 13:56:34.737640 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64ae040_f93d_4dcc_8dce_812b127b0630.slice/crio-40a7c8cab81b0c4b28211598fb878413ae6945369cb5c1948cdb74c75dbba010 WatchSource:0}: Error finding container 40a7c8cab81b0c4b28211598fb878413ae6945369cb5c1948cdb74c75dbba010: Status 404 returned error can't find the container with id 40a7c8cab81b0c4b28211598fb878413ae6945369cb5c1948cdb74c75dbba010 Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.771827 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llmhn"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.773910 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.784962 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llmhn"] Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.810800 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.811031 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-catalog-content\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.811069 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-utilities\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.811132 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpsp7\" (UniqueName: \"kubernetes.io/projected/44b3d3fc-c9d0-4487-b176-26ef1f32de15-kube-api-access-fpsp7\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.811332 4794 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-15 13:56:35.311313195 +0000 UTC m=+157.163335633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.908921 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.912135 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.912167 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-catalog-content\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.912188 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-utilities\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.912215 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpsp7\" (UniqueName: \"kubernetes.io/projected/44b3d3fc-c9d0-4487-b176-26ef1f32de15-kube-api-access-fpsp7\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: E1215 13:56:34.912740 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-15 13:56:35.41272276 +0000 UTC m=+157.264745198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q5mq8" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.912805 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-utilities\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.912907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-catalog-content\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.935202 4794 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-15T13:56:34.350196068Z","Handler":null,"Name":""} Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.940869 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpsp7\" (UniqueName: \"kubernetes.io/projected/44b3d3fc-c9d0-4487-b176-26ef1f32de15-kube-api-access-fpsp7\") pod \"community-operators-llmhn\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.942453 4794 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 15 13:56:34 crc kubenswrapper[4794]: I1215 13:56:34.942492 4794 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.013308 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.022840 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.046803 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.046965 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt" event={"ID":"da9394b5-19ce-4482-b107-9339d9813a25","Type":"ContainerDied","Data":"a5c71ffd93af086dc7ba8566d0ae9a7b78259c84ceca829225250d81426ce1ff"} Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.047000 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c71ffd93af086dc7ba8566d0ae9a7b78259c84ceca829225250d81426ce1ff" Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.067560 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" event={"ID":"3f106ed7-7479-46b8-ab6f-e6b291078caa","Type":"ContainerStarted","Data":"1ad3423dd56fb22d3415bdc84f1cc641debd8db99af45b943997de308ab6a8a9"} Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.069164 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj6qt" event={"ID":"e64ae040-f93d-4dcc-8dce-812b127b0630","Type":"ContainerStarted","Data":"40a7c8cab81b0c4b28211598fb878413ae6945369cb5c1948cdb74c75dbba010"} Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.114036 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xb6h5"] Dec 15 13:56:35 crc 
kubenswrapper[4794]: I1215 13:56:35.114330 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.117668 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.117709 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:35 crc kubenswrapper[4794]: W1215 13:56:35.123503 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6610ca9_db3b_4fe8_80bb_9b35208e00b2.slice/crio-d36068d2b54dd0b845313fbf532625080e202ff7cc50050c9ba9e9b933a24fb3 WatchSource:0}: Error finding container d36068d2b54dd0b845313fbf532625080e202ff7cc50050c9ba9e9b933a24fb3: Status 404 returned error can't find the container with id d36068d2b54dd0b845313fbf532625080e202ff7cc50050c9ba9e9b933a24fb3 Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.160231 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lkl6"] Dec 15 13:56:35 crc 
kubenswrapper[4794]: I1215 13:56:35.175198 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q5mq8\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.175618 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.192069 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:35 crc kubenswrapper[4794]: W1215 13:56:35.194100 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e69f94d_9c18_4b19_8290_5e2d86ab4bae.slice/crio-ed136637b27af5e46288169708c7a63ac0b9b7e0286b2935d897c6b7710a44e4 WatchSource:0}: Error finding container ed136637b27af5e46288169708c7a63ac0b9b7e0286b2935d897c6b7710a44e4: Status 404 returned error can't find the container with id ed136637b27af5e46288169708c7a63ac0b9b7e0286b2935d897c6b7710a44e4 Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.368921 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llmhn"] Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.405642 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q5mq8"] Dec 15 13:56:35 crc kubenswrapper[4794]: W1215 13:56:35.456788 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc235fb96_c626_4949_bd6e_a21dd37bc9d1.slice/crio-9fd04e3f1cc0ccbdc1517cdac96318f7769744f5954692c2d952eb0b4fddb472 WatchSource:0}: Error finding container 9fd04e3f1cc0ccbdc1517cdac96318f7769744f5954692c2d952eb0b4fddb472: Status 404 returned error can't find the container with id 9fd04e3f1cc0ccbdc1517cdac96318f7769744f5954692c2d952eb0b4fddb472 Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.718244 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:35 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:35 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:35 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:35 crc kubenswrapper[4794]: I1215 13:56:35.718306 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.074909 4794 generic.go:334] "Generic (PLEG): container finished" podID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerID="2984ca1a127b4c0e3b592a3e787756adb5ffc8b281183dfe3b189412b697548b" exitCode=0 Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.074991 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerDied","Data":"2984ca1a127b4c0e3b592a3e787756adb5ffc8b281183dfe3b189412b697548b"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.075018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" 
event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerStarted","Data":"d36068d2b54dd0b845313fbf532625080e202ff7cc50050c9ba9e9b933a24fb3"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.076462 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.077494 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" event={"ID":"3f106ed7-7479-46b8-ab6f-e6b291078caa","Type":"ContainerStarted","Data":"30620882c7f4ed1050c94288e46a70eac7e981fa1a064bb6216bf0961aec7726"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.079450 4794 generic.go:334] "Generic (PLEG): container finished" podID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerID="fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307" exitCode=0 Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.079517 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llmhn" event={"ID":"44b3d3fc-c9d0-4487-b176-26ef1f32de15","Type":"ContainerDied","Data":"fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.079537 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llmhn" event={"ID":"44b3d3fc-c9d0-4487-b176-26ef1f32de15","Type":"ContainerStarted","Data":"c107824738f43a5e1e1ec93d040b7964c3fe57d62384cd9167ceb0c8aa8065d4"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.081315 4794 generic.go:334] "Generic (PLEG): container finished" podID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerID="accfb95f38d2325c5fa01cd8726b2319341acbf5b134df565c0c0a63dd050a6a" exitCode=0 Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.081378 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lkl6" 
event={"ID":"4e69f94d-9c18-4b19-8290-5e2d86ab4bae","Type":"ContainerDied","Data":"accfb95f38d2325c5fa01cd8726b2319341acbf5b134df565c0c0a63dd050a6a"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.081403 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lkl6" event={"ID":"4e69f94d-9c18-4b19-8290-5e2d86ab4bae","Type":"ContainerStarted","Data":"ed136637b27af5e46288169708c7a63ac0b9b7e0286b2935d897c6b7710a44e4"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.083426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" event={"ID":"c235fb96-c626-4949-bd6e-a21dd37bc9d1","Type":"ContainerStarted","Data":"15380e638f9143fff388d4cc74e8a21779d87a7bdc36831e2bb73929408d13b7"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.083448 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" event={"ID":"c235fb96-c626-4949-bd6e-a21dd37bc9d1","Type":"ContainerStarted","Data":"9fd04e3f1cc0ccbdc1517cdac96318f7769744f5954692c2d952eb0b4fddb472"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.085937 4794 generic.go:334] "Generic (PLEG): container finished" podID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerID="c9229114feafcb8f1cb1f2bdc548a53a40bdc3afb091d50ffc8cfe9c3392f9be" exitCode=0 Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.085972 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj6qt" event={"ID":"e64ae040-f93d-4dcc-8dce-812b127b0630","Type":"ContainerDied","Data":"c9229114feafcb8f1cb1f2bdc548a53a40bdc3afb091d50ffc8cfe9c3392f9be"} Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.161344 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n7hgc" podStartSLOduration=12.161324676 podStartE2EDuration="12.161324676s" 
podCreationTimestamp="2025-12-15 13:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:36.149973891 +0000 UTC m=+158.001996349" watchObservedRunningTime="2025-12-15 13:56:36.161324676 +0000 UTC m=+158.013347124" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.163272 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvh"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.164675 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.166646 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.185160 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvh"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.232692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-catalog-content\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.232752 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-utilities\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.232893 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vm82\" (UniqueName: \"kubernetes.io/projected/9f11449c-f57e-46af-bd65-ed450d85ba60-kube-api-access-2vm82\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.333773 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-catalog-content\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.333834 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-utilities\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.333867 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vm82\" (UniqueName: \"kubernetes.io/projected/9f11449c-f57e-46af-bd65-ed450d85ba60-kube-api-access-2vm82\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.334334 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-catalog-content\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.334367 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-utilities\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.360531 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vm82\" (UniqueName: \"kubernetes.io/projected/9f11449c-f57e-46af-bd65-ed450d85ba60-kube-api-access-2vm82\") pod \"redhat-marketplace-5sjvh\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.377445 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.382366 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-69q8p" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.482088 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.559968 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6fxvm"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.560845 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.571470 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fxvm"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.590188 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.590788 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.592306 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.592842 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.595745 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.639271 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-utilities\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.639325 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d496b91-6a5b-449b-b23d-bc86f43361eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.639348 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-catalog-content\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.639367 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d496b91-6a5b-449b-b23d-bc86f43361eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.639432 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpnv\" (UniqueName: \"kubernetes.io/projected/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-kube-api-access-8vpnv\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.716396 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:36 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:36 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:36 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.716451 4794 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.737080 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvh"] Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.742518 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-utilities\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.742606 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d496b91-6a5b-449b-b23d-bc86f43361eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.742633 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-catalog-content\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.742666 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d496b91-6a5b-449b-b23d-bc86f43361eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 
13:56:36.742785 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpnv\" (UniqueName: \"kubernetes.io/projected/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-kube-api-access-8vpnv\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.744832 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-utilities\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.744897 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d496b91-6a5b-449b-b23d-bc86f43361eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.745191 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-catalog-content\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.760549 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.767222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpnv\" (UniqueName: 
\"kubernetes.io/projected/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-kube-api-access-8vpnv\") pod \"redhat-marketplace-6fxvm\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.770141 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d496b91-6a5b-449b-b23d-bc86f43361eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.876399 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:56:36 crc kubenswrapper[4794]: I1215 13:56:36.902871 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.044726 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fxvm"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.372128 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snr54"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.376307 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.380441 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.392763 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snr54"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.398170 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.399338 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.401831 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.402253 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.461101 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.464361 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstwn\" (UniqueName: \"kubernetes.io/projected/e7c17057-4b6f-40ed-8901-b8e6249317ed-kube-api-access-sstwn\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.464413 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-utilities\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.464454 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.464559 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.464595 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-catalog-content\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.565409 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.565556 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.565613 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-catalog-content\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.565632 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.565659 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstwn\" (UniqueName: \"kubernetes.io/projected/e7c17057-4b6f-40ed-8901-b8e6249317ed-kube-api-access-sstwn\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.565787 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-utilities\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.566212 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-catalog-content\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.566479 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-utilities\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.589443 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstwn\" (UniqueName: \"kubernetes.io/projected/e7c17057-4b6f-40ed-8901-b8e6249317ed-kube-api-access-sstwn\") pod \"redhat-operators-snr54\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.591267 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.657818 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.657865 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.659330 4794 patch_prober.go:28] interesting pod/console-f9d7485db-rfx4t container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 
10.217.0.17:8443: connect: connection refused" start-of-body= Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.659372 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rfx4t" podUID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.713895 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.717518 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:37 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:37 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:37 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.717559 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.767812 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwlx2"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.769574 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.782433 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.782869 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwlx2"] Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.790743 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.869502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-utilities\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.869551 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-catalog-content\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.869708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqnr\" (UniqueName: \"kubernetes.io/projected/480d7d96-d156-4f7c-8db0-11a23c26149b-kube-api-access-nzqnr\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.922538 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvh" event={"ID":"9f11449c-f57e-46af-bd65-ed450d85ba60","Type":"ContainerStarted","Data":"b0cf56f05e21975c80242bda5f3cceb9ebb8824380ec4f907918cb2fa0f1a899"} Dec 15 
13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.923203 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.945892 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" podStartSLOduration=138.945873191 podStartE2EDuration="2m18.945873191s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:37.942365667 +0000 UTC m=+159.794388145" watchObservedRunningTime="2025-12-15 13:56:37.945873191 +0000 UTC m=+159.797895639" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.956416 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5m27d" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.960842 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-5m27d container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.960979 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5m27d" podUID="6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.960867 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-5m27d container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 15 
13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.961965 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5m27d" podUID="6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.962531 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-5m27d container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.962625 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5m27d" podUID="6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.970960 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqnr\" (UniqueName: \"kubernetes.io/projected/480d7d96-d156-4f7c-8db0-11a23c26149b-kube-api-access-nzqnr\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.971258 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-utilities\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.971408 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-catalog-content\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.979878 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-catalog-content\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.979929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-utilities\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:37 crc kubenswrapper[4794]: I1215 13:56:37.998944 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqnr\" (UniqueName: \"kubernetes.io/projected/480d7d96-d156-4f7c-8db0-11a23c26149b-kube-api-access-nzqnr\") pod \"redhat-operators-zwlx2\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.086971 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.322697 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwlx2"] Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.419842 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snr54"] Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.442773 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.446868 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.717679 4794 patch_prober.go:28] interesting pod/router-default-5444994796-wd9tc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 15 13:56:38 crc kubenswrapper[4794]: [-]has-synced failed: reason withheld Dec 15 13:56:38 crc kubenswrapper[4794]: [+]process-running ok Dec 15 13:56:38 crc kubenswrapper[4794]: healthz check failed Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.717786 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wd9tc" podUID="06366fe3-a2f0-4ff6-b2ec-61d1d9a736e7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.926774 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f","Type":"ContainerStarted","Data":"296215fa4c2a669f66a3472f0c6283fcd74c163e4f31ec8d212675277c41c323"} Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.927621 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwlx2" event={"ID":"480d7d96-d156-4f7c-8db0-11a23c26149b","Type":"ContainerStarted","Data":"e7ca91624d52c893a6cf62d7edfe8dab80eb9235b7ead17dbdb24cf7069a469f"} Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.928462 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snr54" event={"ID":"e7c17057-4b6f-40ed-8901-b8e6249317ed","Type":"ContainerStarted","Data":"c08f915e25b892acb13d839c5f74b8d45ad8aedb32b3d505fdc75a268519f881"} Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.929300 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fxvm" event={"ID":"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b","Type":"ContainerStarted","Data":"5f01efbb8fa7cf9a7944cf5c43adcbba2bea609d437c28e2001a1817156aa0dc"} Dec 15 13:56:38 crc kubenswrapper[4794]: I1215 13:56:38.930662 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d496b91-6a5b-449b-b23d-bc86f43361eb","Type":"ContainerStarted","Data":"c82e75b13e2d48cc381caec0260f8136032e3a29e495241550f073554315e9fd"} Dec 15 13:56:39 crc kubenswrapper[4794]: I1215 13:56:39.717143 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:39 crc kubenswrapper[4794]: I1215 13:56:39.719415 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wd9tc" Dec 15 13:56:39 crc kubenswrapper[4794]: I1215 13:56:39.952140 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerID="cfa5656dd220f2e750b1fb30e56de044fd8835c0dc1a3a471eeda191dc3d2cc6" exitCode=0 Dec 15 13:56:39 crc kubenswrapper[4794]: I1215 13:56:39.952283 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5sjvh" event={"ID":"9f11449c-f57e-46af-bd65-ed450d85ba60","Type":"ContainerDied","Data":"cfa5656dd220f2e750b1fb30e56de044fd8835c0dc1a3a471eeda191dc3d2cc6"} Dec 15 13:56:39 crc kubenswrapper[4794]: I1215 13:56:39.990773 4794 generic.go:334] "Generic (PLEG): container finished" podID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerID="0c07d7edcd6b9a234a6a380d5f4d430346b8630f367c5f94e3cee5bc6ca41536" exitCode=0 Dec 15 13:56:39 crc kubenswrapper[4794]: I1215 13:56:39.990867 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snr54" event={"ID":"e7c17057-4b6f-40ed-8901-b8e6249317ed","Type":"ContainerDied","Data":"0c07d7edcd6b9a234a6a380d5f4d430346b8630f367c5f94e3cee5bc6ca41536"} Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:39.998754 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d496b91-6a5b-449b-b23d-bc86f43361eb","Type":"ContainerStarted","Data":"71ca3d6c7c2445af638e2eef7b4511a538bc8abe98048fc03ce58fc9f8b59e65"} Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.001491 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerID="241704ec353f3006e2cc0eb4d7fd57828e77c97b2e0fe662ff8df1448d89d22c" exitCode=0 Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.001626 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fxvm" event={"ID":"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b","Type":"ContainerDied","Data":"241704ec353f3006e2cc0eb4d7fd57828e77c97b2e0fe662ff8df1448d89d22c"} Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.006059 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f","Type":"ContainerStarted","Data":"520090f1d4b9e016b4b3865c01deff89169c2c0f9f002c216f658e316e0c9869"} Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.012704 4794 generic.go:334] "Generic (PLEG): container finished" podID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerID="509f98ad9fb2d82916f8bdcf7c893ca5f21beb05a75a7adba2cef3f871eb4c5b" exitCode=0 Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.013520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwlx2" event={"ID":"480d7d96-d156-4f7c-8db0-11a23c26149b","Type":"ContainerDied","Data":"509f98ad9fb2d82916f8bdcf7c893ca5f21beb05a75a7adba2cef3f871eb4c5b"} Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.049706 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.064187 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.064171221 podStartE2EDuration="4.064171221s" podCreationTimestamp="2025-12-15 13:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:40.061473259 +0000 UTC m=+161.913495697" watchObservedRunningTime="2025-12-15 13:56:40.064171221 +0000 UTC m=+161.916193659" Dec 15 13:56:40 crc kubenswrapper[4794]: I1215 13:56:40.088948 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.088912506 podStartE2EDuration="3.088912506s" podCreationTimestamp="2025-12-15 13:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:56:40.087715434 +0000 UTC m=+161.939737872" 
watchObservedRunningTime="2025-12-15 13:56:40.088912506 +0000 UTC m=+161.940934944" Dec 15 13:56:41 crc kubenswrapper[4794]: I1215 13:56:41.025195 4794 generic.go:334] "Generic (PLEG): container finished" podID="3d496b91-6a5b-449b-b23d-bc86f43361eb" containerID="71ca3d6c7c2445af638e2eef7b4511a538bc8abe98048fc03ce58fc9f8b59e65" exitCode=0 Dec 15 13:56:41 crc kubenswrapper[4794]: I1215 13:56:41.025411 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d496b91-6a5b-449b-b23d-bc86f43361eb","Type":"ContainerDied","Data":"71ca3d6c7c2445af638e2eef7b4511a538bc8abe98048fc03ce58fc9f8b59e65"} Dec 15 13:56:41 crc kubenswrapper[4794]: I1215 13:56:41.029280 4794 generic.go:334] "Generic (PLEG): container finished" podID="de7e9ce3-d34e-4394-ba2d-aed65da5dc0f" containerID="520090f1d4b9e016b4b3865c01deff89169c2c0f9f002c216f658e316e0c9869" exitCode=0 Dec 15 13:56:41 crc kubenswrapper[4794]: I1215 13:56:41.029319 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f","Type":"ContainerDied","Data":"520090f1d4b9e016b4b3865c01deff89169c2c0f9f002c216f658e316e0c9869"} Dec 15 13:56:41 crc kubenswrapper[4794]: I1215 13:56:41.928519 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:41 crc kubenswrapper[4794]: I1215 13:56:41.934061 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f6d5d8e-1512-4d71-8363-ba6003bf10b6-metrics-certs\") pod \"network-metrics-daemon-4xt6f\" (UID: \"1f6d5d8e-1512-4d71-8363-ba6003bf10b6\") " 
pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.035521 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xt6f" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.454542 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.456303 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.463448 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4xt6f"] Dec 15 13:56:42 crc kubenswrapper[4794]: W1215 13:56:42.481422 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6d5d8e_1512_4d71_8363_ba6003bf10b6.slice/crio-4a26a82b9dc87aa0b5d5c4a7adfef5bfc59bd79c97ae0873800e4af70e2ba1e7 WatchSource:0}: Error finding container 4a26a82b9dc87aa0b5d5c4a7adfef5bfc59bd79c97ae0873800e4af70e2ba1e7: Status 404 returned error can't find the container with id 4a26a82b9dc87aa0b5d5c4a7adfef5bfc59bd79c97ae0873800e4af70e2ba1e7 Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.639188 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d496b91-6a5b-449b-b23d-bc86f43361eb-kube-api-access\") pod \"3d496b91-6a5b-449b-b23d-bc86f43361eb\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.639266 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d496b91-6a5b-449b-b23d-bc86f43361eb-kubelet-dir\") pod 
\"3d496b91-6a5b-449b-b23d-bc86f43361eb\" (UID: \"3d496b91-6a5b-449b-b23d-bc86f43361eb\") " Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.639301 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kubelet-dir\") pod \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.639382 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kube-api-access\") pod \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\" (UID: \"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f\") " Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.642660 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d496b91-6a5b-449b-b23d-bc86f43361eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3d496b91-6a5b-449b-b23d-bc86f43361eb" (UID: "3d496b91-6a5b-449b-b23d-bc86f43361eb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.642682 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de7e9ce3-d34e-4394-ba2d-aed65da5dc0f" (UID: "de7e9ce3-d34e-4394-ba2d-aed65da5dc0f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.644524 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d496b91-6a5b-449b-b23d-bc86f43361eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3d496b91-6a5b-449b-b23d-bc86f43361eb" (UID: "3d496b91-6a5b-449b-b23d-bc86f43361eb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.644931 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de7e9ce3-d34e-4394-ba2d-aed65da5dc0f" (UID: "de7e9ce3-d34e-4394-ba2d-aed65da5dc0f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.741189 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.741235 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d496b91-6a5b-449b-b23d-bc86f43361eb-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.741247 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d496b91-6a5b-449b-b23d-bc86f43361eb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:56:42 crc kubenswrapper[4794]: I1215 13:56:42.741259 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de7e9ce3-d34e-4394-ba2d-aed65da5dc0f-kubelet-dir\") on node \"crc\" 
DevicePath \"\"" Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.046155 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.046626 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"de7e9ce3-d34e-4394-ba2d-aed65da5dc0f","Type":"ContainerDied","Data":"296215fa4c2a669f66a3472f0c6283fcd74c163e4f31ec8d212675277c41c323"} Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.046647 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="296215fa4c2a669f66a3472f0c6283fcd74c163e4f31ec8d212675277c41c323" Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.050885 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" event={"ID":"1f6d5d8e-1512-4d71-8363-ba6003bf10b6","Type":"ContainerStarted","Data":"6d44c899abcc6e01ac3321825ea1730c1d1a11028ee24a9178a7e2d8eeaa4011"} Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.050907 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" event={"ID":"1f6d5d8e-1512-4d71-8363-ba6003bf10b6","Type":"ContainerStarted","Data":"4a26a82b9dc87aa0b5d5c4a7adfef5bfc59bd79c97ae0873800e4af70e2ba1e7"} Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.058638 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d496b91-6a5b-449b-b23d-bc86f43361eb","Type":"ContainerDied","Data":"c82e75b13e2d48cc381caec0260f8136032e3a29e495241550f073554315e9fd"} Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.058658 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c82e75b13e2d48cc381caec0260f8136032e3a29e495241550f073554315e9fd" Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 
13:56:43.058757 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 15 13:56:43 crc kubenswrapper[4794]: I1215 13:56:43.067958 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kpl54" Dec 15 13:56:47 crc kubenswrapper[4794]: I1215 13:56:47.669067 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:47 crc kubenswrapper[4794]: I1215 13:56:47.679758 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 13:56:47 crc kubenswrapper[4794]: I1215 13:56:47.955151 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-5m27d container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 15 13:56:47 crc kubenswrapper[4794]: I1215 13:56:47.955207 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5m27d" podUID="6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 15 13:56:47 crc kubenswrapper[4794]: I1215 13:56:47.955652 4794 patch_prober.go:28] interesting pod/downloads-7954f5f757-5m27d container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 15 13:56:47 crc kubenswrapper[4794]: I1215 13:56:47.955673 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5m27d" podUID="6a4d35d6-30d1-4da3-8ecc-2112b46b2b5d" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 15 13:56:49 crc kubenswrapper[4794]: I1215 13:56:49.277558 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqpgr"] Dec 15 13:56:49 crc kubenswrapper[4794]: I1215 13:56:49.277850 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" podUID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" containerName="controller-manager" containerID="cri-o://4c5e71643eba6e68b8d8fb74163c30bf12c851b026e3b6c043de0575ea3d3ea6" gracePeriod=30 Dec 15 13:56:49 crc kubenswrapper[4794]: I1215 13:56:49.297937 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb"] Dec 15 13:56:49 crc kubenswrapper[4794]: I1215 13:56:49.298288 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerName="route-controller-manager" containerID="cri-o://ea63348063b5827a95cf5eb6936939fa93531af90af599752f2e7de48dea2ecf" gracePeriod=30 Dec 15 13:56:51 crc kubenswrapper[4794]: I1215 13:56:51.104312 4794 generic.go:334] "Generic (PLEG): container finished" podID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerID="ea63348063b5827a95cf5eb6936939fa93531af90af599752f2e7de48dea2ecf" exitCode=0 Dec 15 13:56:51 crc kubenswrapper[4794]: I1215 13:56:51.104434 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" event={"ID":"23e4e919-2929-484f-afe8-e80ddc566e7c","Type":"ContainerDied","Data":"ea63348063b5827a95cf5eb6936939fa93531af90af599752f2e7de48dea2ecf"} Dec 15 13:56:51 crc kubenswrapper[4794]: I1215 13:56:51.107225 4794 
generic.go:334] "Generic (PLEG): container finished" podID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" containerID="4c5e71643eba6e68b8d8fb74163c30bf12c851b026e3b6c043de0575ea3d3ea6" exitCode=0 Dec 15 13:56:51 crc kubenswrapper[4794]: I1215 13:56:51.107276 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" event={"ID":"6fa75386-467c-467e-8ef7-27c65cd6a2b5","Type":"ContainerDied","Data":"4c5e71643eba6e68b8d8fb74163c30bf12c851b026e3b6c043de0575ea3d3ea6"} Dec 15 13:56:54 crc kubenswrapper[4794]: I1215 13:56:54.540264 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 13:56:54 crc kubenswrapper[4794]: I1215 13:56:54.540351 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 13:56:55 crc kubenswrapper[4794]: I1215 13:56:55.198035 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 13:56:56 crc kubenswrapper[4794]: I1215 13:56:56.534980 4794 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-824mb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 15 13:56:56 crc kubenswrapper[4794]: I1215 13:56:56.535064 4794 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 15 13:56:57 crc kubenswrapper[4794]: I1215 13:56:57.253766 4794 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cqpgr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 15 13:56:57 crc kubenswrapper[4794]: I1215 13:56:57.253916 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" podUID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 15 13:56:57 crc kubenswrapper[4794]: I1215 13:56:57.983212 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5m27d" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.865849 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.951345 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa75386-467c-467e-8ef7-27c65cd6a2b5-serving-cert\") pod \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.951486 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-client-ca\") pod \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.951568 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjqnk\" (UniqueName: \"kubernetes.io/projected/6fa75386-467c-467e-8ef7-27c65cd6a2b5-kube-api-access-bjqnk\") pod \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.951623 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-proxy-ca-bundles\") pod \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.951680 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-config\") pod \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\" (UID: \"6fa75386-467c-467e-8ef7-27c65cd6a2b5\") " Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.952493 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fa75386-467c-467e-8ef7-27c65cd6a2b5" (UID: "6fa75386-467c-467e-8ef7-27c65cd6a2b5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.952724 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-config" (OuterVolumeSpecName: "config") pod "6fa75386-467c-467e-8ef7-27c65cd6a2b5" (UID: "6fa75386-467c-467e-8ef7-27c65cd6a2b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.952945 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.952964 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.953460 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6fa75386-467c-467e-8ef7-27c65cd6a2b5" (UID: "6fa75386-467c-467e-8ef7-27c65cd6a2b5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.959085 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9"] Dec 15 13:57:01 crc kubenswrapper[4794]: E1215 13:57:01.959346 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d496b91-6a5b-449b-b23d-bc86f43361eb" containerName="pruner" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.959361 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d496b91-6a5b-449b-b23d-bc86f43361eb" containerName="pruner" Dec 15 13:57:01 crc kubenswrapper[4794]: E1215 13:57:01.959373 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7e9ce3-d34e-4394-ba2d-aed65da5dc0f" containerName="pruner" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.959381 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7e9ce3-d34e-4394-ba2d-aed65da5dc0f" containerName="pruner" Dec 15 13:57:01 crc kubenswrapper[4794]: E1215 13:57:01.959393 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" containerName="controller-manager" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.959403 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" containerName="controller-manager" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.959623 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" containerName="controller-manager" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.963727 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7e9ce3-d34e-4394-ba2d-aed65da5dc0f" containerName="pruner" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.963757 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d496b91-6a5b-449b-b23d-bc86f43361eb" containerName="pruner" 
Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.964289 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.968762 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9"] Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.976721 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa75386-467c-467e-8ef7-27c65cd6a2b5-kube-api-access-bjqnk" (OuterVolumeSpecName: "kube-api-access-bjqnk") pod "6fa75386-467c-467e-8ef7-27c65cd6a2b5" (UID: "6fa75386-467c-467e-8ef7-27c65cd6a2b5"). InnerVolumeSpecName "kube-api-access-bjqnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:57:01 crc kubenswrapper[4794]: I1215 13:57:01.977596 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa75386-467c-467e-8ef7-27c65cd6a2b5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fa75386-467c-467e-8ef7-27c65cd6a2b5" (UID: "6fa75386-467c-467e-8ef7-27c65cd6a2b5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.053708 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmf24\" (UniqueName: \"kubernetes.io/projected/16150433-db11-4dd4-bab9-1806f975a99b-kube-api-access-rmf24\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.053796 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-proxy-ca-bundles\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.053850 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16150433-db11-4dd4-bab9-1806f975a99b-serving-cert\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.053899 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-client-ca\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.054029 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-config\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.054423 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa75386-467c-467e-8ef7-27c65cd6a2b5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.054486 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjqnk\" (UniqueName: \"kubernetes.io/projected/6fa75386-467c-467e-8ef7-27c65cd6a2b5-kube-api-access-bjqnk\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.054507 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6fa75386-467c-467e-8ef7-27c65cd6a2b5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.155003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmf24\" (UniqueName: \"kubernetes.io/projected/16150433-db11-4dd4-bab9-1806f975a99b-kube-api-access-rmf24\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.155045 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-proxy-ca-bundles\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.155064 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16150433-db11-4dd4-bab9-1806f975a99b-serving-cert\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.155096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-client-ca\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.155139 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-config\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.156273 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-client-ca\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.156385 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-proxy-ca-bundles\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 
13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.156671 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-config\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.165417 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16150433-db11-4dd4-bab9-1806f975a99b-serving-cert\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.173068 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmf24\" (UniqueName: \"kubernetes.io/projected/16150433-db11-4dd4-bab9-1806f975a99b-kube-api-access-rmf24\") pod \"controller-manager-79c8ff59f4-5gdk9\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.187504 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" event={"ID":"6fa75386-467c-467e-8ef7-27c65cd6a2b5","Type":"ContainerDied","Data":"fcc4c989808b6d646710b4b37cf36b67e7746f75d8fa18b86532d34b76e193a2"} Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.187609 4794 scope.go:117] "RemoveContainer" containerID="4c5e71643eba6e68b8d8fb74163c30bf12c851b026e3b6c043de0575ea3d3ea6" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.187770 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cqpgr" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.233786 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqpgr"] Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.237316 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cqpgr"] Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.301700 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:02 crc kubenswrapper[4794]: I1215 13:57:02.748874 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa75386-467c-467e-8ef7-27c65cd6a2b5" path="/var/lib/kubelet/pods/6fa75386-467c-467e-8ef7-27c65cd6a2b5/volumes" Dec 15 13:57:07 crc kubenswrapper[4794]: I1215 13:57:07.534229 4794 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-824mb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 15 13:57:07 crc kubenswrapper[4794]: I1215 13:57:07.536106 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 15 13:57:08 crc kubenswrapper[4794]: I1215 13:57:08.096100 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dsm7q" Dec 15 13:57:09 crc kubenswrapper[4794]: I1215 13:57:09.307150 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9"] Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.387085 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.388133 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.392637 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.393384 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.428411 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.499278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67541105-959c-43f2-afe5-672f43498311-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.499376 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67541105-959c-43f2-afe5-672f43498311-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 
13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.601799 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67541105-959c-43f2-afe5-672f43498311-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.602262 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67541105-959c-43f2-afe5-672f43498311-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.602369 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67541105-959c-43f2-afe5-672f43498311-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.628129 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67541105-959c-43f2-afe5-672f43498311-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:12 crc kubenswrapper[4794]: I1215 13:57:12.734012 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.260191 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" event={"ID":"23e4e919-2929-484f-afe8-e80ddc566e7c","Type":"ContainerDied","Data":"bd7235065177f7298a6ef0af23b253891bdbc3dc161b5e6bc955576642eed1a8"} Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.260237 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7235065177f7298a6ef0af23b253891bdbc3dc161b5e6bc955576642eed1a8" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.297507 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.311715 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e4e919-2929-484f-afe8-e80ddc566e7c-serving-cert\") pod \"23e4e919-2929-484f-afe8-e80ddc566e7c\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.311872 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-client-ca\") pod \"23e4e919-2929-484f-afe8-e80ddc566e7c\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.311966 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt2pn\" (UniqueName: \"kubernetes.io/projected/23e4e919-2929-484f-afe8-e80ddc566e7c-kube-api-access-vt2pn\") pod \"23e4e919-2929-484f-afe8-e80ddc566e7c\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.312073 
4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-config\") pod \"23e4e919-2929-484f-afe8-e80ddc566e7c\" (UID: \"23e4e919-2929-484f-afe8-e80ddc566e7c\") " Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.313313 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-client-ca" (OuterVolumeSpecName: "client-ca") pod "23e4e919-2929-484f-afe8-e80ddc566e7c" (UID: "23e4e919-2929-484f-afe8-e80ddc566e7c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.313513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-config" (OuterVolumeSpecName: "config") pod "23e4e919-2929-484f-afe8-e80ddc566e7c" (UID: "23e4e919-2929-484f-afe8-e80ddc566e7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.325340 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e4e919-2929-484f-afe8-e80ddc566e7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23e4e919-2929-484f-afe8-e80ddc566e7c" (UID: "23e4e919-2929-484f-afe8-e80ddc566e7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.327730 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e4e919-2929-484f-afe8-e80ddc566e7c-kube-api-access-vt2pn" (OuterVolumeSpecName: "kube-api-access-vt2pn") pod "23e4e919-2929-484f-afe8-e80ddc566e7c" (UID: "23e4e919-2929-484f-afe8-e80ddc566e7c"). InnerVolumeSpecName "kube-api-access-vt2pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.328413 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm"] Dec 15 13:57:13 crc kubenswrapper[4794]: E1215 13:57:13.332890 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerName="route-controller-manager" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.332931 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerName="route-controller-manager" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.333206 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" containerName="route-controller-manager" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.333896 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.344066 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm"] Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.413763 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-config\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.413838 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-client-ca\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.413887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67c2996f-5b16-4d51-aeaa-004b908e3dde-serving-cert\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.413934 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnpm\" (UniqueName: \"kubernetes.io/projected/67c2996f-5b16-4d51-aeaa-004b908e3dde-kube-api-access-jgnpm\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.414136 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.414158 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e4e919-2929-484f-afe8-e80ddc566e7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.414176 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23e4e919-2929-484f-afe8-e80ddc566e7c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:13 crc kubenswrapper[4794]: 
I1215 13:57:13.414196 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt2pn\" (UniqueName: \"kubernetes.io/projected/23e4e919-2929-484f-afe8-e80ddc566e7c-kube-api-access-vt2pn\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.515378 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-config\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.515474 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-client-ca\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.515565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67c2996f-5b16-4d51-aeaa-004b908e3dde-serving-cert\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.515663 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnpm\" (UniqueName: \"kubernetes.io/projected/67c2996f-5b16-4d51-aeaa-004b908e3dde-kube-api-access-jgnpm\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 
crc kubenswrapper[4794]: I1215 13:57:13.517359 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-client-ca\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.517628 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-config\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.521652 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67c2996f-5b16-4d51-aeaa-004b908e3dde-serving-cert\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.545369 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnpm\" (UniqueName: \"kubernetes.io/projected/67c2996f-5b16-4d51-aeaa-004b908e3dde-kube-api-access-jgnpm\") pod \"route-controller-manager-69b48b9cf5-sb6sm\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: I1215 13:57:13.676882 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:13 crc kubenswrapper[4794]: E1215 13:57:13.741829 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:e9bc35478da4e272fcc5e4573ebac9535075e1f2d8c613b985ef6e3a3c0c813e: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:e9bc35478da4e272fcc5e4573ebac9535075e1f2d8c613b985ef6e3a3c0c813e\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 15 13:57:13 crc kubenswrapper[4794]: E1215 13:57:13.742113 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sstwn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:
nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-snr54_openshift-marketplace(e7c17057-4b6f-40ed-8901-b8e6249317ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:e9bc35478da4e272fcc5e4573ebac9535075e1f2d8c613b985ef6e3a3c0c813e: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:e9bc35478da4e272fcc5e4573ebac9535075e1f2d8c613b985ef6e3a3c0c813e\": context canceled" logger="UnhandledError" Dec 15 13:57:13 crc kubenswrapper[4794]: E1215 13:57:13.743472 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:e9bc35478da4e272fcc5e4573ebac9535075e1f2d8c613b985ef6e3a3c0c813e: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:e9bc35478da4e272fcc5e4573ebac9535075e1f2d8c613b985ef6e3a3c0c813e\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-snr54" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" Dec 15 13:57:14 crc kubenswrapper[4794]: I1215 13:57:14.264043 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb" Dec 15 13:57:14 crc kubenswrapper[4794]: I1215 13:57:14.299025 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb"] Dec 15 13:57:14 crc kubenswrapper[4794]: I1215 13:57:14.301758 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-824mb"] Dec 15 13:57:14 crc kubenswrapper[4794]: I1215 13:57:14.742663 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e4e919-2929-484f-afe8-e80ddc566e7c" path="/var/lib/kubelet/pods/23e4e919-2929-484f-afe8-e80ddc566e7c/volumes" Dec 15 13:57:17 crc kubenswrapper[4794]: I1215 13:57:17.786043 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 15 13:57:17 crc kubenswrapper[4794]: I1215 13:57:17.786832 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:17 crc kubenswrapper[4794]: I1215 13:57:17.808324 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 15 13:57:17 crc kubenswrapper[4794]: I1215 13:57:17.981097 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-var-lock\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:17 crc kubenswrapper[4794]: I1215 13:57:17.981160 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:17 crc kubenswrapper[4794]: I1215 13:57:17.981188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45fd189c-2cde-48f0-8b4c-364e646fd81b-kube-api-access\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.082166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.082246 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/45fd189c-2cde-48f0-8b4c-364e646fd81b-kube-api-access\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.082335 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.083757 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-var-lock\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.083858 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-var-lock\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.109611 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45fd189c-2cde-48f0-8b4c-364e646fd81b-kube-api-access\") pod \"installer-9-crc\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:18 crc kubenswrapper[4794]: I1215 13:57:18.119432 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:57:24 crc kubenswrapper[4794]: I1215 13:57:24.534880 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 13:57:24 crc kubenswrapper[4794]: I1215 13:57:24.535541 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 13:57:24 crc kubenswrapper[4794]: I1215 13:57:24.535636 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 13:57:24 crc kubenswrapper[4794]: I1215 13:57:24.536352 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 13:57:24 crc kubenswrapper[4794]: I1215 13:57:24.536519 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45" gracePeriod=600 Dec 15 13:57:34 crc kubenswrapper[4794]: E1215 13:57:34.127243 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 15 13:57:34 crc kubenswrapper[4794]: E1215 13:57:34.127990 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vpnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6fxvm_openshift-marketplace(3b4751fc-ae26-4b2b-bab6-b99534e1ea1b): ErrImagePull: rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:34 crc kubenswrapper[4794]: E1215 13:57:34.129574 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6fxvm" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" Dec 15 13:57:34 crc kubenswrapper[4794]: I1215 13:57:34.402387 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45" exitCode=0 Dec 15 13:57:34 crc kubenswrapper[4794]: I1215 13:57:34.402521 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45"} Dec 15 13:57:34 crc kubenswrapper[4794]: E1215 13:57:34.477815 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6fxvm" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" Dec 15 13:57:36 crc kubenswrapper[4794]: E1215 13:57:36.937876 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 15 13:57:36 crc kubenswrapper[4794]: E1215 13:57:36.938328 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-868n4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xb6h5_openshift-marketplace(f6610ca9-db3b-4fe8-80bb-9b35208e00b2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:36 crc kubenswrapper[4794]: E1215 13:57:36.939482 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xb6h5" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" Dec 15 13:57:39 crc kubenswrapper[4794]: E1215 13:57:39.985014 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 15 13:57:39 crc kubenswrapper[4794]: E1215 13:57:39.985170 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pnlzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hj6qt_openshift-marketplace(e64ae040-f93d-4dcc-8dce-812b127b0630): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:39 crc kubenswrapper[4794]: E1215 13:57:39.986382 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hj6qt" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" Dec 15 13:57:41 crc kubenswrapper[4794]: E1215 13:57:41.479978 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xb6h5" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" Dec 15 13:57:41 crc kubenswrapper[4794]: E1215 13:57:41.480028 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hj6qt" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" Dec 15 13:57:45 crc kubenswrapper[4794]: E1215 13:57:45.766855 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 15 13:57:45 crc kubenswrapper[4794]: E1215 13:57:45.767041 4794 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzqnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zwlx2_openshift-marketplace(480d7d96-d156-4f7c-8db0-11a23c26149b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:45 crc kubenswrapper[4794]: E1215 13:57:45.768233 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zwlx2" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.114114 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.114985 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vm82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppA
rmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5sjvh_openshift-marketplace(9f11449c-f57e-46af-bd65-ed450d85ba60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.116268 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5sjvh" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" Dec 15 13:57:46 crc kubenswrapper[4794]: I1215 13:57:46.312873 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.563623 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.564092 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8qm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2lkl6_openshift-marketplace(4e69f94d-9c18-4b19-8290-5e2d86ab4bae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.565262 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2lkl6" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" Dec 15 13:57:46 crc 
kubenswrapper[4794]: E1215 13:57:46.569429 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zwlx2" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.569517 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5sjvh" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.680976 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.681361 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fpsp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-llmhn_openshift-marketplace(44b3d3fc-c9d0-4487-b176-26ef1f32de15): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 13:57:46 crc kubenswrapper[4794]: E1215 13:57:46.682707 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-llmhn" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" Dec 15 13:57:46 crc 
kubenswrapper[4794]: I1215 13:57:46.776025 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9"] Dec 15 13:57:46 crc kubenswrapper[4794]: W1215 13:57:46.781016 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16150433_db11_4dd4_bab9_1806f975a99b.slice/crio-4987e65c35762f22bb133c9bbc947c21bdbc6f354077df9d9eb0c85dc7d66c5a WatchSource:0}: Error finding container 4987e65c35762f22bb133c9bbc947c21bdbc6f354077df9d9eb0c85dc7d66c5a: Status 404 returned error can't find the container with id 4987e65c35762f22bb133c9bbc947c21bdbc6f354077df9d9eb0c85dc7d66c5a Dec 15 13:57:46 crc kubenswrapper[4794]: I1215 13:57:46.851272 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm"] Dec 15 13:57:46 crc kubenswrapper[4794]: W1215 13:57:46.862209 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c2996f_5b16_4d51_aeaa_004b908e3dde.slice/crio-342296ae658b4242d906063e1307a375afcb70db09a370e03995886487ba84f6 WatchSource:0}: Error finding container 342296ae658b4242d906063e1307a375afcb70db09a370e03995886487ba84f6: Status 404 returned error can't find the container with id 342296ae658b4242d906063e1307a375afcb70db09a370e03995886487ba84f6 Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.020979 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.510858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" event={"ID":"16150433-db11-4dd4-bab9-1806f975a99b","Type":"ContainerStarted","Data":"edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa"} Dec 15 13:57:47 crc kubenswrapper[4794]: 
I1215 13:57:47.511364 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" event={"ID":"16150433-db11-4dd4-bab9-1806f975a99b","Type":"ContainerStarted","Data":"4987e65c35762f22bb133c9bbc947c21bdbc6f354077df9d9eb0c85dc7d66c5a"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.511468 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" podUID="16150433-db11-4dd4-bab9-1806f975a99b" containerName="controller-manager" containerID="cri-o://edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa" gracePeriod=30 Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.512000 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.514338 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" event={"ID":"67c2996f-5b16-4d51-aeaa-004b908e3dde","Type":"ContainerStarted","Data":"a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.514389 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" event={"ID":"67c2996f-5b16-4d51-aeaa-004b908e3dde","Type":"ContainerStarted","Data":"342296ae658b4242d906063e1307a375afcb70db09a370e03995886487ba84f6"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.514807 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.516124 4794 generic.go:334] "Generic (PLEG): container finished" podID="e7c17057-4b6f-40ed-8901-b8e6249317ed" 
containerID="58fa4436659564f4c2ab19d4f860a1d79a0ba96dd469987464eb0f9655292cb1" exitCode=0 Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.516373 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snr54" event={"ID":"e7c17057-4b6f-40ed-8901-b8e6249317ed","Type":"ContainerDied","Data":"58fa4436659564f4c2ab19d4f860a1d79a0ba96dd469987464eb0f9655292cb1"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.523941 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67541105-959c-43f2-afe5-672f43498311","Type":"ContainerStarted","Data":"306edf36061fa388b100d7b3402ae38a06fdda840cca880ed44bc76607073f62"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.523994 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67541105-959c-43f2-afe5-672f43498311","Type":"ContainerStarted","Data":"c4486716d34c470099f459d9e96fdd10560abdf4ef8c577a90d601bfde17d2dc"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.524823 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.524894 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.541232 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" podStartSLOduration=58.541207305 podStartE2EDuration="58.541207305s" podCreationTimestamp="2025-12-15 13:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:57:47.526992335 +0000 UTC m=+229.379014813" 
watchObservedRunningTime="2025-12-15 13:57:47.541207305 +0000 UTC m=+229.393229743" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.545064 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerID="35fcf9c3611569153696420a654f30a1a3b4dab0ee283e36fd54d01a1113018b" exitCode=0 Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.545143 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fxvm" event={"ID":"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b","Type":"ContainerDied","Data":"35fcf9c3611569153696420a654f30a1a3b4dab0ee283e36fd54d01a1113018b"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.554095 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45fd189c-2cde-48f0-8b4c-364e646fd81b","Type":"ContainerStarted","Data":"88645fb7f59d0103425f135abd17a12fb1612e8769f2f2ec76723e45d0b2581a"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.557609 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"153c3cfa232a964aa700c0cbd7b85f063a70242545eaeec076923fd5ffce712a"} Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.568811 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xt6f" event={"ID":"1f6d5d8e-1512-4d71-8363-ba6003bf10b6","Type":"ContainerStarted","Data":"e5aacae4c7a3923214887594312c90f1f7e914cf08185fe1e62c763ae8382c17"} Dec 15 13:57:47 crc kubenswrapper[4794]: E1215 13:57:47.569258 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-llmhn" 
podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" Dec 15 13:57:47 crc kubenswrapper[4794]: E1215 13:57:47.573091 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2lkl6" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.574234 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=35.574215324 podStartE2EDuration="35.574215324s" podCreationTimestamp="2025-12-15 13:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:57:47.564940663 +0000 UTC m=+229.416963101" watchObservedRunningTime="2025-12-15 13:57:47.574215324 +0000 UTC m=+229.426237762" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.580031 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" podStartSLOduration=38.580011577 podStartE2EDuration="38.580011577s" podCreationTimestamp="2025-12-15 13:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:57:47.554288573 +0000 UTC m=+229.406311011" watchObservedRunningTime="2025-12-15 13:57:47.580011577 +0000 UTC m=+229.432034015" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.672969 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4xt6f" podStartSLOduration=208.672952021 podStartE2EDuration="3m28.672952021s" podCreationTimestamp="2025-12-15 13:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:57:47.654520073 +0000 UTC m=+229.506542511" watchObservedRunningTime="2025-12-15 13:57:47.672952021 +0000 UTC m=+229.524974459" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.890179 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.924311 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5756b84b69-dskm5"] Dec 15 13:57:47 crc kubenswrapper[4794]: E1215 13:57:47.924554 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16150433-db11-4dd4-bab9-1806f975a99b" containerName="controller-manager" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.924568 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="16150433-db11-4dd4-bab9-1806f975a99b" containerName="controller-manager" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.924749 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="16150433-db11-4dd4-bab9-1806f975a99b" containerName="controller-manager" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.925175 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.927330 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5756b84b69-dskm5"] Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.969128 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-client-ca\") pod \"16150433-db11-4dd4-bab9-1806f975a99b\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.969254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmf24\" (UniqueName: \"kubernetes.io/projected/16150433-db11-4dd4-bab9-1806f975a99b-kube-api-access-rmf24\") pod \"16150433-db11-4dd4-bab9-1806f975a99b\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.969300 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16150433-db11-4dd4-bab9-1806f975a99b-serving-cert\") pod \"16150433-db11-4dd4-bab9-1806f975a99b\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.969327 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-proxy-ca-bundles\") pod \"16150433-db11-4dd4-bab9-1806f975a99b\" (UID: \"16150433-db11-4dd4-bab9-1806f975a99b\") " Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.969389 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-config\") pod \"16150433-db11-4dd4-bab9-1806f975a99b\" (UID: 
\"16150433-db11-4dd4-bab9-1806f975a99b\") " Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.970034 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-client-ca" (OuterVolumeSpecName: "client-ca") pod "16150433-db11-4dd4-bab9-1806f975a99b" (UID: "16150433-db11-4dd4-bab9-1806f975a99b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.970055 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-config" (OuterVolumeSpecName: "config") pod "16150433-db11-4dd4-bab9-1806f975a99b" (UID: "16150433-db11-4dd4-bab9-1806f975a99b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.970345 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16150433-db11-4dd4-bab9-1806f975a99b" (UID: "16150433-db11-4dd4-bab9-1806f975a99b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.974195 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16150433-db11-4dd4-bab9-1806f975a99b-kube-api-access-rmf24" (OuterVolumeSpecName: "kube-api-access-rmf24") pod "16150433-db11-4dd4-bab9-1806f975a99b" (UID: "16150433-db11-4dd4-bab9-1806f975a99b"). InnerVolumeSpecName "kube-api-access-rmf24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:57:47 crc kubenswrapper[4794]: I1215 13:57:47.974265 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16150433-db11-4dd4-bab9-1806f975a99b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16150433-db11-4dd4-bab9-1806f975a99b" (UID: "16150433-db11-4dd4-bab9-1806f975a99b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070338 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-config\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070381 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582d1838-fd37-42aa-aef1-bccb8f055b00-serving-cert\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070417 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbdww\" (UniqueName: \"kubernetes.io/projected/582d1838-fd37-42aa-aef1-bccb8f055b00-kube-api-access-rbdww\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070449 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-proxy-ca-bundles\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070466 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-client-ca\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070557 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmf24\" (UniqueName: \"kubernetes.io/projected/16150433-db11-4dd4-bab9-1806f975a99b-kube-api-access-rmf24\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070568 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16150433-db11-4dd4-bab9-1806f975a99b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070589 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070600 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.070609 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16150433-db11-4dd4-bab9-1806f975a99b-client-ca\") on node \"crc\" DevicePath 
\"\"" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.171892 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582d1838-fd37-42aa-aef1-bccb8f055b00-serving-cert\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.171938 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-config\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.171972 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbdww\" (UniqueName: \"kubernetes.io/projected/582d1838-fd37-42aa-aef1-bccb8f055b00-kube-api-access-rbdww\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.172002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-proxy-ca-bundles\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.172027 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-client-ca\") pod \"controller-manager-5756b84b69-dskm5\" (UID: 
\"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.173058 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-client-ca\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.174041 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-config\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.174507 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-proxy-ca-bundles\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.177675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582d1838-fd37-42aa-aef1-bccb8f055b00-serving-cert\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.192059 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbdww\" (UniqueName: 
\"kubernetes.io/projected/582d1838-fd37-42aa-aef1-bccb8f055b00-kube-api-access-rbdww\") pod \"controller-manager-5756b84b69-dskm5\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.253014 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.588025 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snr54" event={"ID":"e7c17057-4b6f-40ed-8901-b8e6249317ed","Type":"ContainerStarted","Data":"19b0c6491fb79b59559fe8e3a21705f9ba3355134fbb1e6469ffc24ad2e4d1a6"} Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.590663 4794 generic.go:334] "Generic (PLEG): container finished" podID="67541105-959c-43f2-afe5-672f43498311" containerID="306edf36061fa388b100d7b3402ae38a06fdda840cca880ed44bc76607073f62" exitCode=0 Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.590734 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67541105-959c-43f2-afe5-672f43498311","Type":"ContainerDied","Data":"306edf36061fa388b100d7b3402ae38a06fdda840cca880ed44bc76607073f62"} Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.594395 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fxvm" event={"ID":"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b","Type":"ContainerStarted","Data":"c291ce0c90d0eefbbab8eed54d506c3deecd95a0651632e0602768eecac4a077"} Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.597421 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45fd189c-2cde-48f0-8b4c-364e646fd81b","Type":"ContainerStarted","Data":"cca4d5b5563adf0d6835dd138fcbc85b218d2dc660b31e70d389e73a3fade8e0"} 
Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.599985 4794 generic.go:334] "Generic (PLEG): container finished" podID="16150433-db11-4dd4-bab9-1806f975a99b" containerID="edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa" exitCode=0 Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.600712 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" event={"ID":"16150433-db11-4dd4-bab9-1806f975a99b","Type":"ContainerDied","Data":"edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa"} Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.600962 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" event={"ID":"16150433-db11-4dd4-bab9-1806f975a99b","Type":"ContainerDied","Data":"4987e65c35762f22bb133c9bbc947c21bdbc6f354077df9d9eb0c85dc7d66c5a"} Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.601002 4794 scope.go:117] "RemoveContainer" containerID="edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.601568 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.604476 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snr54" podStartSLOduration=3.419222278 podStartE2EDuration="1m11.604456176s" podCreationTimestamp="2025-12-15 13:56:37 +0000 UTC" firstStartedPulling="2025-12-15 13:56:39.994731876 +0000 UTC m=+161.846754314" lastFinishedPulling="2025-12-15 13:57:48.179965764 +0000 UTC m=+230.031988212" observedRunningTime="2025-12-15 13:57:48.603989033 +0000 UTC m=+230.456011511" watchObservedRunningTime="2025-12-15 13:57:48.604456176 +0000 UTC m=+230.456478614" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.626552 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6fxvm" podStartSLOduration=4.51933328 podStartE2EDuration="1m12.626529467s" podCreationTimestamp="2025-12-15 13:56:36 +0000 UTC" firstStartedPulling="2025-12-15 13:56:40.013601893 +0000 UTC m=+161.865624331" lastFinishedPulling="2025-12-15 13:57:48.12079808 +0000 UTC m=+229.972820518" observedRunningTime="2025-12-15 13:57:48.624609863 +0000 UTC m=+230.476632321" watchObservedRunningTime="2025-12-15 13:57:48.626529467 +0000 UTC m=+230.478551905" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.630847 4794 scope.go:117] "RemoveContainer" containerID="edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa" Dec 15 13:57:48 crc kubenswrapper[4794]: E1215 13:57:48.631456 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa\": container with ID starting with edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa not found: ID does not exist" 
containerID="edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.631497 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa"} err="failed to get container status \"edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa\": rpc error: code = NotFound desc = could not find container \"edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa\": container with ID starting with edee0219a25032fa7832e04c6dd2d5c953ac4ee6ae07973ab1a4a82a568cc7aa not found: ID does not exist" Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.668287 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5756b84b69-dskm5"] Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.669439 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=31.669394602 podStartE2EDuration="31.669394602s" podCreationTimestamp="2025-12-15 13:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:57:48.661518981 +0000 UTC m=+230.513541419" watchObservedRunningTime="2025-12-15 13:57:48.669394602 +0000 UTC m=+230.521417050" Dec 15 13:57:48 crc kubenswrapper[4794]: W1215 13:57:48.675521 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582d1838_fd37_42aa_aef1_bccb8f055b00.slice/crio-d838815b1b5babb767af12e1b6647cd48aa551a67e7cede6541065e46db24fc8 WatchSource:0}: Error finding container d838815b1b5babb767af12e1b6647cd48aa551a67e7cede6541065e46db24fc8: Status 404 returned error can't find the container with id d838815b1b5babb767af12e1b6647cd48aa551a67e7cede6541065e46db24fc8 Dec 15 13:57:48 crc 
kubenswrapper[4794]: I1215 13:57:48.679814 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9"] Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.683595 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79c8ff59f4-5gdk9"] Dec 15 13:57:48 crc kubenswrapper[4794]: I1215 13:57:48.756100 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16150433-db11-4dd4-bab9-1806f975a99b" path="/var/lib/kubelet/pods/16150433-db11-4dd4-bab9-1806f975a99b/volumes" Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.609131 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" event={"ID":"582d1838-fd37-42aa-aef1-bccb8f055b00","Type":"ContainerStarted","Data":"786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e"} Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.609877 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.609905 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" event={"ID":"582d1838-fd37-42aa-aef1-bccb8f055b00","Type":"ContainerStarted","Data":"d838815b1b5babb767af12e1b6647cd48aa551a67e7cede6541065e46db24fc8"} Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.615120 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.644202 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" podStartSLOduration=40.644180944 podStartE2EDuration="40.644180944s" 
podCreationTimestamp="2025-12-15 13:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:57:49.627355451 +0000 UTC m=+231.479377939" watchObservedRunningTime="2025-12-15 13:57:49.644180944 +0000 UTC m=+231.496203402" Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.871641 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.998488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67541105-959c-43f2-afe5-672f43498311-kube-api-access\") pod \"67541105-959c-43f2-afe5-672f43498311\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.998607 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67541105-959c-43f2-afe5-672f43498311-kubelet-dir\") pod \"67541105-959c-43f2-afe5-672f43498311\" (UID: \"67541105-959c-43f2-afe5-672f43498311\") " Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.998764 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67541105-959c-43f2-afe5-672f43498311-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "67541105-959c-43f2-afe5-672f43498311" (UID: "67541105-959c-43f2-afe5-672f43498311"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:57:49 crc kubenswrapper[4794]: I1215 13:57:49.998918 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67541105-959c-43f2-afe5-672f43498311-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:50 crc kubenswrapper[4794]: I1215 13:57:50.015230 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67541105-959c-43f2-afe5-672f43498311-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "67541105-959c-43f2-afe5-672f43498311" (UID: "67541105-959c-43f2-afe5-672f43498311"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:57:50 crc kubenswrapper[4794]: I1215 13:57:50.100416 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67541105-959c-43f2-afe5-672f43498311-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:57:50 crc kubenswrapper[4794]: I1215 13:57:50.614653 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 15 13:57:50 crc kubenswrapper[4794]: I1215 13:57:50.614815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"67541105-959c-43f2-afe5-672f43498311","Type":"ContainerDied","Data":"c4486716d34c470099f459d9e96fdd10560abdf4ef8c577a90d601bfde17d2dc"} Dec 15 13:57:50 crc kubenswrapper[4794]: I1215 13:57:50.615283 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4486716d34c470099f459d9e96fdd10560abdf4ef8c577a90d601bfde17d2dc" Dec 15 13:57:55 crc kubenswrapper[4794]: I1215 13:57:55.652287 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerStarted","Data":"2204dfa6dc6bf1087ba6bbc8c94e28e60ccd0534a5e450c813926518531749ba"} Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.665514 4794 generic.go:334] "Generic (PLEG): container finished" podID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerID="2204dfa6dc6bf1087ba6bbc8c94e28e60ccd0534a5e450c813926518531749ba" exitCode=0 Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.665622 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerDied","Data":"2204dfa6dc6bf1087ba6bbc8c94e28e60ccd0534a5e450c813926518531749ba"} Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.667797 4794 generic.go:334] "Generic (PLEG): container finished" podID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerID="b78270170dc1985d94193370da58c8a972dbbfc5ae665327f421e1943e069ec3" exitCode=0 Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.667834 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj6qt" 
event={"ID":"e64ae040-f93d-4dcc-8dce-812b127b0630","Type":"ContainerDied","Data":"b78270170dc1985d94193370da58c8a972dbbfc5ae665327f421e1943e069ec3"} Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.877987 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.878176 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:57:56 crc kubenswrapper[4794]: I1215 13:57:56.936758 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.676368 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerStarted","Data":"74711175d483ed24277afa04ff24b6432f70d43aadf53480a008f530720d2d58"} Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.682493 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj6qt" event={"ID":"e64ae040-f93d-4dcc-8dce-812b127b0630","Type":"ContainerStarted","Data":"4e563c0ec966978662106c7f60108c8d6cdba60cd75fdd5011a1c12c12288c79"} Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.703833 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xb6h5" podStartSLOduration=2.594581192 podStartE2EDuration="1m23.703803049s" podCreationTimestamp="2025-12-15 13:56:34 +0000 UTC" firstStartedPulling="2025-12-15 13:56:36.07624272 +0000 UTC m=+157.928265148" lastFinishedPulling="2025-12-15 13:57:57.185464567 +0000 UTC m=+239.037487005" observedRunningTime="2025-12-15 13:57:57.703424038 +0000 UTC m=+239.555446486" watchObservedRunningTime="2025-12-15 13:57:57.703803049 +0000 UTC m=+239.555825487" 
Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.758105 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hj6qt" podStartSLOduration=2.541818623 podStartE2EDuration="1m23.758080245s" podCreationTimestamp="2025-12-15 13:56:34 +0000 UTC" firstStartedPulling="2025-12-15 13:56:36.087142823 +0000 UTC m=+157.939165271" lastFinishedPulling="2025-12-15 13:57:57.303404445 +0000 UTC m=+239.155426893" observedRunningTime="2025-12-15 13:57:57.720623472 +0000 UTC m=+239.572645910" watchObservedRunningTime="2025-12-15 13:57:57.758080245 +0000 UTC m=+239.610102683" Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.783311 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.783380 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:57:57 crc kubenswrapper[4794]: I1215 13:57:57.823799 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:57:58 crc kubenswrapper[4794]: I1215 13:57:58.369067 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:57:58 crc kubenswrapper[4794]: I1215 13:57:58.735531 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 13:57:59 crc kubenswrapper[4794]: I1215 13:57:59.168058 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fxvm"] Dec 15 13:58:00 crc kubenswrapper[4794]: I1215 13:58:00.701501 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6fxvm" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" 
containerName="registry-server" containerID="cri-o://c291ce0c90d0eefbbab8eed54d506c3deecd95a0651632e0602768eecac4a077" gracePeriod=2 Dec 15 13:58:01 crc kubenswrapper[4794]: I1215 13:58:01.707566 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerID="c291ce0c90d0eefbbab8eed54d506c3deecd95a0651632e0602768eecac4a077" exitCode=0 Dec 15 13:58:01 crc kubenswrapper[4794]: I1215 13:58:01.707636 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fxvm" event={"ID":"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b","Type":"ContainerDied","Data":"c291ce0c90d0eefbbab8eed54d506c3deecd95a0651632e0602768eecac4a077"} Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.832763 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.890299 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpnv\" (UniqueName: \"kubernetes.io/projected/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-kube-api-access-8vpnv\") pod \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.890374 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-catalog-content\") pod \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.890443 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-utilities\") pod \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\" (UID: \"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b\") " Dec 15 13:58:02 crc 
kubenswrapper[4794]: I1215 13:58:02.891239 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-utilities" (OuterVolumeSpecName: "utilities") pod "3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" (UID: "3b4751fc-ae26-4b2b-bab6-b99534e1ea1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.896838 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-kube-api-access-8vpnv" (OuterVolumeSpecName: "kube-api-access-8vpnv") pod "3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" (UID: "3b4751fc-ae26-4b2b-bab6-b99534e1ea1b"). InnerVolumeSpecName "kube-api-access-8vpnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.915666 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" (UID: "3b4751fc-ae26-4b2b-bab6-b99534e1ea1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.991796 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.991830 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpnv\" (UniqueName: \"kubernetes.io/projected/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-kube-api-access-8vpnv\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:02 crc kubenswrapper[4794]: I1215 13:58:02.991844 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.719122 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fxvm" event={"ID":"3b4751fc-ae26-4b2b-bab6-b99534e1ea1b","Type":"ContainerDied","Data":"5f01efbb8fa7cf9a7944cf5c43adcbba2bea609d437c28e2001a1817156aa0dc"} Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.719331 4794 scope.go:117] "RemoveContainer" containerID="c291ce0c90d0eefbbab8eed54d506c3deecd95a0651632e0602768eecac4a077" Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.719346 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fxvm" Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.726905 4794 generic.go:334] "Generic (PLEG): container finished" podID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerID="0a9197a1e1f19beedfade9c2303382068428698d457bd677fd7ac027c53ba183" exitCode=0 Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.726957 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwlx2" event={"ID":"480d7d96-d156-4f7c-8db0-11a23c26149b","Type":"ContainerDied","Data":"0a9197a1e1f19beedfade9c2303382068428698d457bd677fd7ac027c53ba183"} Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.732954 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvh" event={"ID":"9f11449c-f57e-46af-bd65-ed450d85ba60","Type":"ContainerDied","Data":"43c3e82725bb5aa95a7d79834af725b011c0a4c42441c2230b230702b8168559"} Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.733053 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerID="43c3e82725bb5aa95a7d79834af725b011c0a4c42441c2230b230702b8168559" exitCode=0 Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.735635 4794 generic.go:334] "Generic (PLEG): container finished" podID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerID="49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972" exitCode=0 Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.735660 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llmhn" event={"ID":"44b3d3fc-c9d0-4487-b176-26ef1f32de15","Type":"ContainerDied","Data":"49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972"} Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.778309 4794 scope.go:117] "RemoveContainer" containerID="35fcf9c3611569153696420a654f30a1a3b4dab0ee283e36fd54d01a1113018b" Dec 15 13:58:03 
crc kubenswrapper[4794]: I1215 13:58:03.800392 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fxvm"] Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.802753 4794 scope.go:117] "RemoveContainer" containerID="241704ec353f3006e2cc0eb4d7fd57828e77c97b2e0fe662ff8df1448d89d22c" Dec 15 13:58:03 crc kubenswrapper[4794]: I1215 13:58:03.805440 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fxvm"] Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.497876 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.498304 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.540143 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.744134 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" path="/var/lib/kubelet/pods/3b4751fc-ae26-4b2b-bab6-b99534e1ea1b/volumes" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.744652 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvh" event={"ID":"9f11449c-f57e-46af-bd65-ed450d85ba60","Type":"ContainerStarted","Data":"f620805ee7c2db4ed0acd4ca5a446043321fa203315478f9b1f4ba13fa838bbc"} Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.744955 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llmhn" event={"ID":"44b3d3fc-c9d0-4487-b176-26ef1f32de15","Type":"ContainerStarted","Data":"bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b"} Dec 15 13:58:04 crc 
kubenswrapper[4794]: I1215 13:58:04.747560 4794 generic.go:334] "Generic (PLEG): container finished" podID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerID="1ad01068926b4c29d1a4606e101bc7d8406779049135df3d9ca855e4bbea7021" exitCode=0 Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.747624 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lkl6" event={"ID":"4e69f94d-9c18-4b19-8290-5e2d86ab4bae","Type":"ContainerDied","Data":"1ad01068926b4c29d1a4606e101bc7d8406779049135df3d9ca855e4bbea7021"} Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.750929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwlx2" event={"ID":"480d7d96-d156-4f7c-8db0-11a23c26149b","Type":"ContainerStarted","Data":"9d156abbcb34e6984e15897bad5368a46adced1dbfc945560aeb3693ed3270a0"} Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.772041 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5sjvh" podStartSLOduration=4.168979424 podStartE2EDuration="1m28.772024124s" podCreationTimestamp="2025-12-15 13:56:36 +0000 UTC" firstStartedPulling="2025-12-15 13:56:39.974068371 +0000 UTC m=+161.826090809" lastFinishedPulling="2025-12-15 13:58:04.577113071 +0000 UTC m=+246.429135509" observedRunningTime="2025-12-15 13:58:04.767113716 +0000 UTC m=+246.619136154" watchObservedRunningTime="2025-12-15 13:58:04.772024124 +0000 UTC m=+246.624046562" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.807904 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.811314 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwlx2" podStartSLOduration=3.5655631679999997 podStartE2EDuration="1m27.811300529s" podCreationTimestamp="2025-12-15 
13:56:37 +0000 UTC" firstStartedPulling="2025-12-15 13:56:40.015126094 +0000 UTC m=+161.867148532" lastFinishedPulling="2025-12-15 13:58:04.260863435 +0000 UTC m=+246.112885893" observedRunningTime="2025-12-15 13:58:04.810468616 +0000 UTC m=+246.662491054" watchObservedRunningTime="2025-12-15 13:58:04.811300529 +0000 UTC m=+246.663322977" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.812400 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llmhn" podStartSLOduration=2.669869493 podStartE2EDuration="1m30.81239336s" podCreationTimestamp="2025-12-15 13:56:34 +0000 UTC" firstStartedPulling="2025-12-15 13:56:36.080848704 +0000 UTC m=+157.932871162" lastFinishedPulling="2025-12-15 13:58:04.223372581 +0000 UTC m=+246.075395029" observedRunningTime="2025-12-15 13:58:04.786697957 +0000 UTC m=+246.638720395" watchObservedRunningTime="2025-12-15 13:58:04.81239336 +0000 UTC m=+246.664415788" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.909965 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.910063 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:58:04 crc kubenswrapper[4794]: I1215 13:58:04.955873 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:58:05 crc kubenswrapper[4794]: I1215 13:58:05.176434 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:58:05 crc kubenswrapper[4794]: I1215 13:58:05.176477 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:58:05 crc kubenswrapper[4794]: I1215 13:58:05.757287 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lkl6" event={"ID":"4e69f94d-9c18-4b19-8290-5e2d86ab4bae","Type":"ContainerStarted","Data":"9b0713c45dbf8089905f1a8d77cb4732e668e2a79d474b12c71e1be4c5e3225c"} Dec 15 13:58:05 crc kubenswrapper[4794]: I1215 13:58:05.777760 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2lkl6" podStartSLOduration=2.443532025 podStartE2EDuration="1m31.777742837s" podCreationTimestamp="2025-12-15 13:56:34 +0000 UTC" firstStartedPulling="2025-12-15 13:56:36.082537679 +0000 UTC m=+157.934560137" lastFinishedPulling="2025-12-15 13:58:05.416748511 +0000 UTC m=+247.268770949" observedRunningTime="2025-12-15 13:58:05.775766881 +0000 UTC m=+247.627789339" watchObservedRunningTime="2025-12-15 13:58:05.777742837 +0000 UTC m=+247.629765275" Dec 15 13:58:05 crc kubenswrapper[4794]: I1215 13:58:05.800459 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:58:06 crc kubenswrapper[4794]: I1215 13:58:06.209985 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-llmhn" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="registry-server" probeResult="failure" output=< Dec 15 13:58:06 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Dec 15 13:58:06 crc kubenswrapper[4794]: > Dec 15 13:58:06 crc kubenswrapper[4794]: I1215 13:58:06.483437 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:58:06 crc kubenswrapper[4794]: I1215 13:58:06.483754 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:58:06 crc kubenswrapper[4794]: I1215 13:58:06.531036 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:58:07 crc kubenswrapper[4794]: I1215 13:58:07.564482 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xb6h5"] Dec 15 13:58:08 crc kubenswrapper[4794]: I1215 13:58:08.087910 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:58:08 crc kubenswrapper[4794]: I1215 13:58:08.087966 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:58:08 crc kubenswrapper[4794]: I1215 13:58:08.772593 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xb6h5" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="registry-server" containerID="cri-o://74711175d483ed24277afa04ff24b6432f70d43aadf53480a008f530720d2d58" gracePeriod=2 Dec 15 13:58:09 crc kubenswrapper[4794]: I1215 13:58:09.126666 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zwlx2" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="registry-server" probeResult="failure" output=< Dec 15 13:58:09 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Dec 15 13:58:09 crc kubenswrapper[4794]: > Dec 15 13:58:09 crc kubenswrapper[4794]: I1215 13:58:09.287672 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5756b84b69-dskm5"] Dec 15 13:58:09 crc kubenswrapper[4794]: I1215 13:58:09.287863 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" podUID="582d1838-fd37-42aa-aef1-bccb8f055b00" containerName="controller-manager" containerID="cri-o://786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e" gracePeriod=30 Dec 15 
13:58:09 crc kubenswrapper[4794]: I1215 13:58:09.387398 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm"] Dec 15 13:58:09 crc kubenswrapper[4794]: I1215 13:58:09.387625 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" podUID="67c2996f-5b16-4d51-aeaa-004b908e3dde" containerName="route-controller-manager" containerID="cri-o://a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7" gracePeriod=30 Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.508211 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536149 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"] Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.536372 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582d1838-fd37-42aa-aef1-bccb8f055b00" containerName="controller-manager" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536387 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="582d1838-fd37-42aa-aef1-bccb8f055b00" containerName="controller-manager" Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.536402 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="extract-content" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536409 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="extract-content" Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.536421 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67541105-959c-43f2-afe5-672f43498311" 
containerName="pruner" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536427 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="67541105-959c-43f2-afe5-672f43498311" containerName="pruner" Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.536451 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="extract-utilities" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536458 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="extract-utilities" Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.536466 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="registry-server" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536473 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="registry-server" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536567 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="582d1838-fd37-42aa-aef1-bccb8f055b00" containerName="controller-manager" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536601 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4751fc-ae26-4b2b-bab6-b99534e1ea1b" containerName="registry-server" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.536614 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="67541105-959c-43f2-afe5-672f43498311" containerName="pruner" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.537206 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.549628 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"] Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655560 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbdww\" (UniqueName: \"kubernetes.io/projected/582d1838-fd37-42aa-aef1-bccb8f055b00-kube-api-access-rbdww\") pod \"582d1838-fd37-42aa-aef1-bccb8f055b00\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655631 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-config\") pod \"582d1838-fd37-42aa-aef1-bccb8f055b00\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655667 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-proxy-ca-bundles\") pod \"582d1838-fd37-42aa-aef1-bccb8f055b00\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655689 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582d1838-fd37-42aa-aef1-bccb8f055b00-serving-cert\") pod \"582d1838-fd37-42aa-aef1-bccb8f055b00\" (UID: \"582d1838-fd37-42aa-aef1-bccb8f055b00\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655746 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-client-ca\") pod \"582d1838-fd37-42aa-aef1-bccb8f055b00\" (UID: 
\"582d1838-fd37-42aa-aef1-bccb8f055b00\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655921 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca1b031-98e5-47b0-87da-ca66cee361c3-serving-cert\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.655958 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62wtm\" (UniqueName: \"kubernetes.io/projected/5ca1b031-98e5-47b0-87da-ca66cee361c3-kube-api-access-62wtm\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.656021 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-config\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.656039 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-client-ca\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.656074 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-proxy-ca-bundles\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.656513 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "582d1838-fd37-42aa-aef1-bccb8f055b00" (UID: "582d1838-fd37-42aa-aef1-bccb8f055b00"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.656635 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-config" (OuterVolumeSpecName: "config") pod "582d1838-fd37-42aa-aef1-bccb8f055b00" (UID: "582d1838-fd37-42aa-aef1-bccb8f055b00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.656923 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-client-ca" (OuterVolumeSpecName: "client-ca") pod "582d1838-fd37-42aa-aef1-bccb8f055b00" (UID: "582d1838-fd37-42aa-aef1-bccb8f055b00"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.660351 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582d1838-fd37-42aa-aef1-bccb8f055b00-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "582d1838-fd37-42aa-aef1-bccb8f055b00" (UID: "582d1838-fd37-42aa-aef1-bccb8f055b00"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.663702 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582d1838-fd37-42aa-aef1-bccb8f055b00-kube-api-access-rbdww" (OuterVolumeSpecName: "kube-api-access-rbdww") pod "582d1838-fd37-42aa-aef1-bccb8f055b00" (UID: "582d1838-fd37-42aa-aef1-bccb8f055b00"). InnerVolumeSpecName "kube-api-access-rbdww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.682727 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757300 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67c2996f-5b16-4d51-aeaa-004b908e3dde-serving-cert\") pod \"67c2996f-5b16-4d51-aeaa-004b908e3dde\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757382 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgnpm\" (UniqueName: \"kubernetes.io/projected/67c2996f-5b16-4d51-aeaa-004b908e3dde-kube-api-access-jgnpm\") pod \"67c2996f-5b16-4d51-aeaa-004b908e3dde\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757409 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-client-ca\") pod \"67c2996f-5b16-4d51-aeaa-004b908e3dde\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757432 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-config\") pod \"67c2996f-5b16-4d51-aeaa-004b908e3dde\" (UID: \"67c2996f-5b16-4d51-aeaa-004b908e3dde\") " Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757628 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-config\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-client-ca\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757698 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-proxy-ca-bundles\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757737 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca1b031-98e5-47b0-87da-ca66cee361c3-serving-cert\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757762 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62wtm\" 
(UniqueName: \"kubernetes.io/projected/5ca1b031-98e5-47b0-87da-ca66cee361c3-kube-api-access-62wtm\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757855 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757869 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbdww\" (UniqueName: \"kubernetes.io/projected/582d1838-fd37-42aa-aef1-bccb8f055b00-kube-api-access-rbdww\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757879 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757890 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/582d1838-fd37-42aa-aef1-bccb8f055b00-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.757910 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582d1838-fd37-42aa-aef1-bccb8f055b00-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.759467 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-client-ca" (OuterVolumeSpecName: "client-ca") pod "67c2996f-5b16-4d51-aeaa-004b908e3dde" (UID: "67c2996f-5b16-4d51-aeaa-004b908e3dde"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.759515 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-config" (OuterVolumeSpecName: "config") pod "67c2996f-5b16-4d51-aeaa-004b908e3dde" (UID: "67c2996f-5b16-4d51-aeaa-004b908e3dde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.760027 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-proxy-ca-bundles\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.760060 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-client-ca\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.760221 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-config\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.768743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c2996f-5b16-4d51-aeaa-004b908e3dde-kube-api-access-jgnpm" (OuterVolumeSpecName: "kube-api-access-jgnpm") pod 
"67c2996f-5b16-4d51-aeaa-004b908e3dde" (UID: "67c2996f-5b16-4d51-aeaa-004b908e3dde"). InnerVolumeSpecName "kube-api-access-jgnpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.768754 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c2996f-5b16-4d51-aeaa-004b908e3dde-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67c2996f-5b16-4d51-aeaa-004b908e3dde" (UID: "67c2996f-5b16-4d51-aeaa-004b908e3dde"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.769473 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca1b031-98e5-47b0-87da-ca66cee361c3-serving-cert\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.772816 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62wtm\" (UniqueName: \"kubernetes.io/projected/5ca1b031-98e5-47b0-87da-ca66cee361c3-kube-api-access-62wtm\") pod \"controller-manager-8c77c5b59-hdwz6\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") " pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.791375 4794 generic.go:334] "Generic (PLEG): container finished" podID="67c2996f-5b16-4d51-aeaa-004b908e3dde" containerID="a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7" exitCode=0 Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.791407 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.791450 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" event={"ID":"67c2996f-5b16-4d51-aeaa-004b908e3dde","Type":"ContainerDied","Data":"a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7"} Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.791534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm" event={"ID":"67c2996f-5b16-4d51-aeaa-004b908e3dde","Type":"ContainerDied","Data":"342296ae658b4242d906063e1307a375afcb70db09a370e03995886487ba84f6"} Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.791568 4794 scope.go:117] "RemoveContainer" containerID="a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.793656 4794 generic.go:334] "Generic (PLEG): container finished" podID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerID="74711175d483ed24277afa04ff24b6432f70d43aadf53480a008f530720d2d58" exitCode=0 Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.793707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerDied","Data":"74711175d483ed24277afa04ff24b6432f70d43aadf53480a008f530720d2d58"} Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.795576 4794 generic.go:334] "Generic (PLEG): container finished" podID="582d1838-fd37-42aa-aef1-bccb8f055b00" containerID="786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e" exitCode=0 Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.795619 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" event={"ID":"582d1838-fd37-42aa-aef1-bccb8f055b00","Type":"ContainerDied","Data":"786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e"} Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.795639 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" event={"ID":"582d1838-fd37-42aa-aef1-bccb8f055b00","Type":"ContainerDied","Data":"d838815b1b5babb767af12e1b6647cd48aa551a67e7cede6541065e46db24fc8"} Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.795717 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5756b84b69-dskm5" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.810373 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5756b84b69-dskm5"] Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.812832 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5756b84b69-dskm5"] Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.816814 4794 scope.go:117] "RemoveContainer" containerID="a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7" Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.817448 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7\": container with ID starting with a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7 not found: ID does not exist" containerID="a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.817483 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7"} err="failed to get container status \"a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7\": rpc error: code = NotFound desc = could not find container \"a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7\": container with ID starting with a9279149a3e4b4841dcc54ee5783dc516bb248db50a43b0af7bf92761fed56e7 not found: ID does not exist" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.817528 4794 scope.go:117] "RemoveContainer" containerID="786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.819653 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm"] Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.822234 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b48b9cf5-sb6sm"] Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.832924 4794 scope.go:117] "RemoveContainer" containerID="786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e" Dec 15 13:58:12 crc kubenswrapper[4794]: E1215 13:58:12.833361 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e\": container with ID starting with 786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e not found: ID does not exist" containerID="786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.833407 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e"} err="failed to get container status 
\"786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e\": rpc error: code = NotFound desc = could not find container \"786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e\": container with ID starting with 786b23b178c4de427be901f0ccaff919a0541b39470725f28f703800e96a3f4e not found: ID does not exist" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.853939 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.859219 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67c2996f-5b16-4d51-aeaa-004b908e3dde-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.859240 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgnpm\" (UniqueName: \"kubernetes.io/projected/67c2996f-5b16-4d51-aeaa-004b908e3dde-kube-api-access-jgnpm\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.859249 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:12 crc kubenswrapper[4794]: I1215 13:58:12.859259 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c2996f-5b16-4d51-aeaa-004b908e3dde-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.057769 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.103267 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"] Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.162247 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-catalog-content\") pod \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.162632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-utilities\") pod \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.162755 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-868n4\" (UniqueName: \"kubernetes.io/projected/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-kube-api-access-868n4\") pod \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\" (UID: \"f6610ca9-db3b-4fe8-80bb-9b35208e00b2\") " Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.163576 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-utilities" (OuterVolumeSpecName: "utilities") pod "f6610ca9-db3b-4fe8-80bb-9b35208e00b2" (UID: "f6610ca9-db3b-4fe8-80bb-9b35208e00b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.169994 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-kube-api-access-868n4" (OuterVolumeSpecName: "kube-api-access-868n4") pod "f6610ca9-db3b-4fe8-80bb-9b35208e00b2" (UID: "f6610ca9-db3b-4fe8-80bb-9b35208e00b2"). InnerVolumeSpecName "kube-api-access-868n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.244522 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6610ca9-db3b-4fe8-80bb-9b35208e00b2" (UID: "f6610ca9-db3b-4fe8-80bb-9b35208e00b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.267048 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.267095 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-868n4\" (UniqueName: \"kubernetes.io/projected/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-kube-api-access-868n4\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.267123 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6610ca9-db3b-4fe8-80bb-9b35208e00b2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.809427 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" 
event={"ID":"5ca1b031-98e5-47b0-87da-ca66cee361c3","Type":"ContainerStarted","Data":"f9250d7eda278dd06d741168bd5f13fbfe1506db50bc6b74ca54d5d2eb995c95"} Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.809478 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" event={"ID":"5ca1b031-98e5-47b0-87da-ca66cee361c3","Type":"ContainerStarted","Data":"e3f4ed99ad6b5ae7d6dbbe0c0e279b090ccc1ac386617ee40ab34d7a05938c02"} Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.809905 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.815478 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb6h5" event={"ID":"f6610ca9-db3b-4fe8-80bb-9b35208e00b2","Type":"ContainerDied","Data":"d36068d2b54dd0b845313fbf532625080e202ff7cc50050c9ba9e9b933a24fb3"} Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.815524 4794 scope.go:117] "RemoveContainer" containerID="74711175d483ed24277afa04ff24b6432f70d43aadf53480a008f530720d2d58" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.815671 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xb6h5" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.815932 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.832319 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" podStartSLOduration=4.83230542 podStartE2EDuration="4.83230542s" podCreationTimestamp="2025-12-15 13:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:58:13.829241443 +0000 UTC m=+255.681263881" watchObservedRunningTime="2025-12-15 13:58:13.83230542 +0000 UTC m=+255.684327858" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.836091 4794 scope.go:117] "RemoveContainer" containerID="2204dfa6dc6bf1087ba6bbc8c94e28e60ccd0534a5e450c813926518531749ba" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.860778 4794 scope.go:117] "RemoveContainer" containerID="2984ca1a127b4c0e3b592a3e787756adb5ffc8b281183dfe3b189412b697548b" Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.865384 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xb6h5"] Dec 15 13:58:13 crc kubenswrapper[4794]: I1215 13:58:13.867680 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xb6h5"] Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.692370 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.692772 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:58:14 crc kubenswrapper[4794]: 
I1215 13:58:14.747913 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582d1838-fd37-42aa-aef1-bccb8f055b00" path="/var/lib/kubelet/pods/582d1838-fd37-42aa-aef1-bccb8f055b00/volumes" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.748533 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c2996f-5b16-4d51-aeaa-004b908e3dde" path="/var/lib/kubelet/pods/67c2996f-5b16-4d51-aeaa-004b908e3dde/volumes" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.749275 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" path="/var/lib/kubelet/pods/f6610ca9-db3b-4fe8-80bb-9b35208e00b2/volumes" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.750498 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.879542 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"] Dec 15 13:58:14 crc kubenswrapper[4794]: E1215 13:58:14.879861 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="extract-content" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.879878 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="extract-content" Dec 15 13:58:14 crc kubenswrapper[4794]: E1215 13:58:14.879897 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="extract-utilities" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.879905 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="extract-utilities" Dec 15 13:58:14 crc kubenswrapper[4794]: E1215 13:58:14.879925 4794 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="67c2996f-5b16-4d51-aeaa-004b908e3dde" containerName="route-controller-manager" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.879933 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c2996f-5b16-4d51-aeaa-004b908e3dde" containerName="route-controller-manager" Dec 15 13:58:14 crc kubenswrapper[4794]: E1215 13:58:14.879952 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="registry-server" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.879959 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="registry-server" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.880081 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c2996f-5b16-4d51-aeaa-004b908e3dde" containerName="route-controller-manager" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.880093 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6610ca9-db3b-4fe8-80bb-9b35208e00b2" containerName="registry-server" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.880546 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.885514 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.886921 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.887220 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.888768 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.888878 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.888917 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.889020 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"] Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.914029 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.998241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f4ac38e-f424-493a-8c65-3002f7ed1626-serving-cert\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: 
\"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.998318 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-config\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.998366 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5gp\" (UniqueName: \"kubernetes.io/projected/7f4ac38e-f424-493a-8c65-3002f7ed1626-kube-api-access-bx5gp\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:14 crc kubenswrapper[4794]: I1215 13:58:14.998716 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-client-ca\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.099717 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f4ac38e-f424-493a-8c65-3002f7ed1626-serving-cert\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.099770 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-config\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.099804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5gp\" (UniqueName: \"kubernetes.io/projected/7f4ac38e-f424-493a-8c65-3002f7ed1626-kube-api-access-bx5gp\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.099912 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-client-ca\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.100990 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-client-ca\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.103550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-config\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " 
pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.111639 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f4ac38e-f424-493a-8c65-3002f7ed1626-serving-cert\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.141479 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5gp\" (UniqueName: \"kubernetes.io/projected/7f4ac38e-f424-493a-8c65-3002f7ed1626-kube-api-access-bx5gp\") pod \"route-controller-manager-c79c46f4f-5s9xn\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") " pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.208661 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.229873 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.300764 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.688549 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"] Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.773332 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llmhn"] Dec 15 13:58:15 crc kubenswrapper[4794]: I1215 13:58:15.833029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" event={"ID":"7f4ac38e-f424-493a-8c65-3002f7ed1626","Type":"ContainerStarted","Data":"bd58b4b2233a1ed332aab6ebd408dd0b960459e1186dba3daec20365c484a40e"} Dec 15 13:58:16 crc kubenswrapper[4794]: I1215 13:58:16.539931 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 13:58:16 crc kubenswrapper[4794]: I1215 13:58:16.711355 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2rt5j"] Dec 15 13:58:16 crc kubenswrapper[4794]: I1215 13:58:16.838550 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" event={"ID":"7f4ac38e-f424-493a-8c65-3002f7ed1626","Type":"ContainerStarted","Data":"f128febb89135628d56f80402fcdfc96c099db1292e6461429306c8c0282d2d6"} Dec 15 13:58:16 crc kubenswrapper[4794]: I1215 
13:58:16.838711 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-llmhn" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="registry-server" containerID="cri-o://bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b" gracePeriod=2 Dec 15 13:58:16 crc kubenswrapper[4794]: I1215 13:58:16.864775 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" podStartSLOduration=7.864749215 podStartE2EDuration="7.864749215s" podCreationTimestamp="2025-12-15 13:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:58:16.860699881 +0000 UTC m=+258.712722329" watchObservedRunningTime="2025-12-15 13:58:16.864749215 +0000 UTC m=+258.716771663" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.258253 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.337633 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpsp7\" (UniqueName: \"kubernetes.io/projected/44b3d3fc-c9d0-4487-b176-26ef1f32de15-kube-api-access-fpsp7\") pod \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.337711 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-utilities\") pod \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.337769 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-catalog-content\") pod \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\" (UID: \"44b3d3fc-c9d0-4487-b176-26ef1f32de15\") " Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.338842 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-utilities" (OuterVolumeSpecName: "utilities") pod "44b3d3fc-c9d0-4487-b176-26ef1f32de15" (UID: "44b3d3fc-c9d0-4487-b176-26ef1f32de15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.342448 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b3d3fc-c9d0-4487-b176-26ef1f32de15-kube-api-access-fpsp7" (OuterVolumeSpecName: "kube-api-access-fpsp7") pod "44b3d3fc-c9d0-4487-b176-26ef1f32de15" (UID: "44b3d3fc-c9d0-4487-b176-26ef1f32de15"). InnerVolumeSpecName "kube-api-access-fpsp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.386163 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44b3d3fc-c9d0-4487-b176-26ef1f32de15" (UID: "44b3d3fc-c9d0-4487-b176-26ef1f32de15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.438896 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.438938 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpsp7\" (UniqueName: \"kubernetes.io/projected/44b3d3fc-c9d0-4487-b176-26ef1f32de15-kube-api-access-fpsp7\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.438947 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44b3d3fc-c9d0-4487-b176-26ef1f32de15-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.846670 4794 generic.go:334] "Generic (PLEG): container finished" podID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerID="bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b" exitCode=0 Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.846708 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llmhn" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.846727 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llmhn" event={"ID":"44b3d3fc-c9d0-4487-b176-26ef1f32de15","Type":"ContainerDied","Data":"bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b"} Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.846803 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llmhn" event={"ID":"44b3d3fc-c9d0-4487-b176-26ef1f32de15","Type":"ContainerDied","Data":"c107824738f43a5e1e1ec93d040b7964c3fe57d62384cd9167ceb0c8aa8065d4"} Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.846827 4794 scope.go:117] "RemoveContainer" containerID="bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.847387 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.854526 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.864066 4794 scope.go:117] "RemoveContainer" containerID="49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972" Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.993391 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llmhn"] Dec 15 13:58:17 crc kubenswrapper[4794]: I1215 13:58:17.998999 4794 scope.go:117] "RemoveContainer" containerID="fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.000837 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-llmhn"] Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.017151 4794 scope.go:117] "RemoveContainer" containerID="bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b" Dec 15 13:58:18 crc kubenswrapper[4794]: E1215 13:58:18.022135 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b\": container with ID starting with bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b not found: ID does not exist" containerID="bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.022180 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b"} err="failed to get container status \"bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b\": rpc error: code = NotFound desc = could not find container \"bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b\": container with ID starting with bdc3f4baa4c82912a01c0d38f1b188579ace0f86d11de86f66557f24a31d805b not found: ID does not exist" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.022210 4794 scope.go:117] "RemoveContainer" containerID="49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972" Dec 15 13:58:18 crc kubenswrapper[4794]: E1215 13:58:18.022460 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972\": container with ID starting with 49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972 not found: ID does not exist" containerID="49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 
13:58:18.022480 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972"} err="failed to get container status \"49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972\": rpc error: code = NotFound desc = could not find container \"49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972\": container with ID starting with 49720b1553923d4115e580ca53ad863323880f291c7ba116687bea7a8a5a9972 not found: ID does not exist" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.022494 4794 scope.go:117] "RemoveContainer" containerID="fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307" Dec 15 13:58:18 crc kubenswrapper[4794]: E1215 13:58:18.022737 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307\": container with ID starting with fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307 not found: ID does not exist" containerID="fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.022760 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307"} err="failed to get container status \"fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307\": rpc error: code = NotFound desc = could not find container \"fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307\": container with ID starting with fecdc119653ca9fca0ed5c178aee79821b6479eb5a1d78203505dcaefd165307 not found: ID does not exist" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.131839 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:58:18 crc 
kubenswrapper[4794]: I1215 13:58:18.172662 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:58:18 crc kubenswrapper[4794]: I1215 13:58:18.746392 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" path="/var/lib/kubelet/pods/44b3d3fc-c9d0-4487-b176-26ef1f32de15/volumes" Dec 15 13:58:20 crc kubenswrapper[4794]: I1215 13:58:20.975981 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwlx2"] Dec 15 13:58:20 crc kubenswrapper[4794]: I1215 13:58:20.977105 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwlx2" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="registry-server" containerID="cri-o://9d156abbcb34e6984e15897bad5368a46adced1dbfc945560aeb3693ed3270a0" gracePeriod=2 Dec 15 13:58:22 crc kubenswrapper[4794]: I1215 13:58:22.882259 4794 generic.go:334] "Generic (PLEG): container finished" podID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerID="9d156abbcb34e6984e15897bad5368a46adced1dbfc945560aeb3693ed3270a0" exitCode=0 Dec 15 13:58:22 crc kubenswrapper[4794]: I1215 13:58:22.882470 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwlx2" event={"ID":"480d7d96-d156-4f7c-8db0-11a23c26149b","Type":"ContainerDied","Data":"9d156abbcb34e6984e15897bad5368a46adced1dbfc945560aeb3693ed3270a0"} Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.311749 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.455024 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-catalog-content\") pod \"480d7d96-d156-4f7c-8db0-11a23c26149b\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.455113 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzqnr\" (UniqueName: \"kubernetes.io/projected/480d7d96-d156-4f7c-8db0-11a23c26149b-kube-api-access-nzqnr\") pod \"480d7d96-d156-4f7c-8db0-11a23c26149b\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.455165 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-utilities\") pod \"480d7d96-d156-4f7c-8db0-11a23c26149b\" (UID: \"480d7d96-d156-4f7c-8db0-11a23c26149b\") " Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.456046 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-utilities" (OuterVolumeSpecName: "utilities") pod "480d7d96-d156-4f7c-8db0-11a23c26149b" (UID: "480d7d96-d156-4f7c-8db0-11a23c26149b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.462056 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480d7d96-d156-4f7c-8db0-11a23c26149b-kube-api-access-nzqnr" (OuterVolumeSpecName: "kube-api-access-nzqnr") pod "480d7d96-d156-4f7c-8db0-11a23c26149b" (UID: "480d7d96-d156-4f7c-8db0-11a23c26149b"). InnerVolumeSpecName "kube-api-access-nzqnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.556713 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzqnr\" (UniqueName: \"kubernetes.io/projected/480d7d96-d156-4f7c-8db0-11a23c26149b-kube-api-access-nzqnr\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.556740 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.559082 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "480d7d96-d156-4f7c-8db0-11a23c26149b" (UID: "480d7d96-d156-4f7c-8db0-11a23c26149b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.657755 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480d7d96-d156-4f7c-8db0-11a23c26149b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.891458 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwlx2" event={"ID":"480d7d96-d156-4f7c-8db0-11a23c26149b","Type":"ContainerDied","Data":"e7ca91624d52c893a6cf62d7edfe8dab80eb9235b7ead17dbdb24cf7069a469f"} Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.891533 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwlx2" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.891975 4794 scope.go:117] "RemoveContainer" containerID="9d156abbcb34e6984e15897bad5368a46adced1dbfc945560aeb3693ed3270a0" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.917263 4794 scope.go:117] "RemoveContainer" containerID="0a9197a1e1f19beedfade9c2303382068428698d457bd677fd7ac027c53ba183" Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.936831 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwlx2"] Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.944961 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwlx2"] Dec 15 13:58:23 crc kubenswrapper[4794]: I1215 13:58:23.952133 4794 scope.go:117] "RemoveContainer" containerID="509f98ad9fb2d82916f8bdcf7c893ca5f21beb05a75a7adba2cef3f871eb4c5b" Dec 15 13:58:24 crc kubenswrapper[4794]: I1215 13:58:24.744041 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" path="/var/lib/kubelet/pods/480d7d96-d156-4f7c-8db0-11a23c26149b/volumes" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048651 4794 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.048858 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="extract-content" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048875 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="extract-content" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.048884 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="registry-server" Dec 15 
13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048890 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="registry-server" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.048900 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="registry-server" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048905 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="registry-server" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.048917 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="extract-utilities" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048923 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="extract-utilities" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.048931 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="extract-utilities" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048937 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="extract-utilities" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.048944 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="extract-content" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.048949 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="extract-content" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049027 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="480d7d96-d156-4f7c-8db0-11a23c26149b" containerName="registry-server" Dec 15 13:58:25 
crc kubenswrapper[4794]: I1215 13:58:25.049037 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b3d3fc-c9d0-4487-b176-26ef1f32de15" containerName="registry-server" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049300 4794 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049500 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f" gracePeriod=15 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049658 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049812 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02" gracePeriod=15 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049872 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e" gracePeriod=15 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049935 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28" gracePeriod=15 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.049937 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37" gracePeriod=15 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.053349 4794 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.054328 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.055005 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.055121 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.055200 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.055278 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.055374 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.055453 4794 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.055527 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.055624 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.055744 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.055865 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.055952 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.056195 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.056289 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.056368 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.056446 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 15 13:58:25 crc 
kubenswrapper[4794]: I1215 13:58:25.056517 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.081944 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.178901 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.179020 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.179091 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.179119 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc 
kubenswrapper[4794]: I1215 13:58:25.179256 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.179370 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.179406 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.179455 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280298 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280330 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280335 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280361 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280365 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280369 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280500 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280601 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280657 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280689 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.280745 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.347517 4794 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.347566 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.381163 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:58:25 crc kubenswrapper[4794]: W1215 13:58:25.396786 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-e65146642fbd6bcebfbe42ab87ee57eee1b56917586b6df3fcf36b49370cf22c WatchSource:0}: Error finding container e65146642fbd6bcebfbe42ab87ee57eee1b56917586b6df3fcf36b49370cf22c: Status 404 returned error can't find the container with id e65146642fbd6bcebfbe42ab87ee57eee1b56917586b6df3fcf36b49370cf22c Dec 15 13:58:25 crc kubenswrapper[4794]: E1215 13:58:25.399463 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18816834eb248e74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-15 13:58:25.398574708 +0000 UTC m=+267.250597146,LastTimestamp:2025-12-15 13:58:25.398574708 +0000 UTC m=+267.250597146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.908987 4794 generic.go:334] "Generic (PLEG): container finished" podID="45fd189c-2cde-48f0-8b4c-364e646fd81b" containerID="cca4d5b5563adf0d6835dd138fcbc85b218d2dc660b31e70d389e73a3fade8e0" exitCode=0 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.909108 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45fd189c-2cde-48f0-8b4c-364e646fd81b","Type":"ContainerDied","Data":"cca4d5b5563adf0d6835dd138fcbc85b218d2dc660b31e70d389e73a3fade8e0"} Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.909960 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.910371 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.910908 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.913757 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.914716 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02" exitCode=0 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.914753 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28" exitCode=0 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.914771 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e" exitCode=0 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.914789 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37" exitCode=2 Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.916426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4"} Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.916455 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e65146642fbd6bcebfbe42ab87ee57eee1b56917586b6df3fcf36b49370cf22c"} Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.917094 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.918104 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:25 crc kubenswrapper[4794]: I1215 13:58:25.918620 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:26 crc kubenswrapper[4794]: E1215 13:58:26.302108 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18816834eb248e74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-15 13:58:25.398574708 +0000 UTC m=+267.250597146,LastTimestamp:2025-12-15 13:58:25.398574708 +0000 UTC m=+267.250597146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.378216 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.379161 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.379704 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.380006 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.380388 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.380487 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.380958 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.381524 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.381951 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508430 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45fd189c-2cde-48f0-8b4c-364e646fd81b-kube-api-access\") pod \"45fd189c-2cde-48f0-8b4c-364e646fd81b\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508521 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 
13:58:27.508549 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-kubelet-dir\") pod \"45fd189c-2cde-48f0-8b4c-364e646fd81b\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508659 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-var-lock\") pod \"45fd189c-2cde-48f0-8b4c-364e646fd81b\" (UID: \"45fd189c-2cde-48f0-8b4c-364e646fd81b\") " Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508685 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508677 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508739 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508714 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "45fd189c-2cde-48f0-8b4c-364e646fd81b" (UID: "45fd189c-2cde-48f0-8b4c-364e646fd81b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508744 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-var-lock" (OuterVolumeSpecName: "var-lock") pod "45fd189c-2cde-48f0-8b4c-364e646fd81b" (UID: "45fd189c-2cde-48f0-8b4c-364e646fd81b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508873 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508985 4794 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.508999 4794 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.509007 4794 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.509014 4794 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/45fd189c-2cde-48f0-8b4c-364e646fd81b-var-lock\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.509022 4794 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.515840 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fd189c-2cde-48f0-8b4c-364e646fd81b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "45fd189c-2cde-48f0-8b4c-364e646fd81b" (UID: "45fd189c-2cde-48f0-8b4c-364e646fd81b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.610301 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45fd189c-2cde-48f0-8b4c-364e646fd81b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.935096 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"45fd189c-2cde-48f0-8b4c-364e646fd81b","Type":"ContainerDied","Data":"88645fb7f59d0103425f135abd17a12fb1612e8769f2f2ec76723e45d0b2581a"} Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.935166 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88645fb7f59d0103425f135abd17a12fb1612e8769f2f2ec76723e45d0b2581a" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.935209 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.940003 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.941181 4794 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f" exitCode=0 Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.941268 4794 scope.go:117] "RemoveContainer" containerID="26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.941296 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.959717 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.960257 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.960713 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.973726 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.974143 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.974401 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.974847 4794 scope.go:117] "RemoveContainer" containerID="c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28" Dec 15 13:58:27 crc kubenswrapper[4794]: I1215 13:58:27.999189 4794 scope.go:117] "RemoveContainer" containerID="c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.024252 4794 scope.go:117] "RemoveContainer" containerID="5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.063184 4794 scope.go:117] "RemoveContainer" containerID="2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.093493 4794 scope.go:117] "RemoveContainer" containerID="c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.123217 4794 scope.go:117] "RemoveContainer" containerID="26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.123750 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\": container with ID starting with 26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02 not 
found: ID does not exist" containerID="26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.123794 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02"} err="failed to get container status \"26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\": rpc error: code = NotFound desc = could not find container \"26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02\": container with ID starting with 26cec34db009daa14075b25813bd5d3159de0c37d0625e747a077a905b8b7d02 not found: ID does not exist" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.123837 4794 scope.go:117] "RemoveContainer" containerID="c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.124197 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\": container with ID starting with c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28 not found: ID does not exist" containerID="c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.124290 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28"} err="failed to get container status \"c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\": rpc error: code = NotFound desc = could not find container \"c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28\": container with ID starting with c082fd23082194312edcd7485fdedb06d1b6771a2711dfaec71c88f6ee40db28 not found: ID does not exist" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.124361 
4794 scope.go:117] "RemoveContainer" containerID="c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.124784 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\": container with ID starting with c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e not found: ID does not exist" containerID="c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.124904 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e"} err="failed to get container status \"c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\": rpc error: code = NotFound desc = could not find container \"c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e\": container with ID starting with c5b178323c82e5486229a606e7336ce70a047404c1be5703f11cd3818d14790e not found: ID does not exist" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.125013 4794 scope.go:117] "RemoveContainer" containerID="5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.125728 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\": container with ID starting with 5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37 not found: ID does not exist" containerID="5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.125759 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37"} err="failed to get container status \"5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\": rpc error: code = NotFound desc = could not find container \"5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37\": container with ID starting with 5e9cc96cba7b4db396fd28b37fdd83183007860d4dc07d7e1d03b99d3aacde37 not found: ID does not exist" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.125782 4794 scope.go:117] "RemoveContainer" containerID="2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.126010 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\": container with ID starting with 2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f not found: ID does not exist" containerID="2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.126092 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f"} err="failed to get container status \"2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\": rpc error: code = NotFound desc = could not find container \"2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f\": container with ID starting with 2e30fdaf3bd0f52106a0c8421f7bc58799dbc7ef7c83b2a901e6e1394865d11f not found: ID does not exist" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.126164 4794 scope.go:117] "RemoveContainer" containerID="c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.126789 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\": container with ID starting with c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042 not found: ID does not exist" containerID="c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.126887 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042"} err="failed to get container status \"c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\": rpc error: code = NotFound desc = could not find container \"c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042\": container with ID starting with c015b444fc6443048f559d54826ffdbc3bc91b70ca507654285ef5f2f208e042 not found: ID does not exist" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.743022 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.743849 4794 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.744156 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.744396 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.762864 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.774235 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 15 13:58:28 crc kubenswrapper[4794]: E1215 13:58:28.776144 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.826565 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 
15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.826641 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.827376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:28 crc kubenswrapper[4794]: I1215 13:58:28.836514 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:29 crc kubenswrapper[4794]: I1215 13:58:29.130818 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:29 crc kubenswrapper[4794]: I1215 13:58:29.130916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:58:29 crc kubenswrapper[4794]: I1215 13:58:29.135854 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:58:29 crc kubenswrapper[4794]: I1215 13:58:29.137002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:29 crc kubenswrapper[4794]: E1215 13:58:29.760386 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:29 crc kubenswrapper[4794]: E1215 13:58:29.761096 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:29 crc kubenswrapper[4794]: E1215 13:58:29.761819 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:29 crc 
kubenswrapper[4794]: E1215 13:58:29.762350 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:29 crc kubenswrapper[4794]: E1215 13:58:29.762760 4794 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:29 crc kubenswrapper[4794]: I1215 13:58:29.762809 4794 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 15 13:58:29 crc kubenswrapper[4794]: E1215 13:58:29.763333 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Dec 15 13:58:29 crc kubenswrapper[4794]: E1215 13:58:29.964525 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Dec 15 13:58:30 crc kubenswrapper[4794]: E1215 13:58:30.365193 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Dec 15 13:58:31 crc kubenswrapper[4794]: E1215 13:58:31.167187 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Dec 15 13:58:32 crc kubenswrapper[4794]: E1215 13:58:32.768095 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Dec 15 13:58:35 crc kubenswrapper[4794]: E1215 13:58:35.969208 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="6.4s" Dec 15 13:58:36 crc kubenswrapper[4794]: E1215 13:58:36.303535 4794 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18816834eb248e74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-15 13:58:25.398574708 +0000 UTC m=+267.250597146,LastTimestamp:2025-12-15 13:58:25.398574708 +0000 UTC m=+267.250597146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 15 13:58:37 crc kubenswrapper[4794]: E1215 13:58:37.036615 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:58:37Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:58:37Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:58:37Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-15T13:58:37Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:37 crc kubenswrapper[4794]: E1215 13:58:37.037813 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:37 crc kubenswrapper[4794]: E1215 13:58:37.038549 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:37 crc kubenswrapper[4794]: E1215 
13:58:37.039520 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:37 crc kubenswrapper[4794]: E1215 13:58:37.040061 4794 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:37 crc kubenswrapper[4794]: E1215 13:58:37.040182 4794 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 15 13:58:38 crc kubenswrapper[4794]: I1215 13:58:38.742782 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:38 crc kubenswrapper[4794]: I1215 13:58:38.743944 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:40 crc kubenswrapper[4794]: I1215 13:58:40.736725 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:40 crc kubenswrapper[4794]: I1215 13:58:40.737981 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:40 crc kubenswrapper[4794]: I1215 13:58:40.738645 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:40 crc kubenswrapper[4794]: I1215 13:58:40.758390 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:40 crc kubenswrapper[4794]: I1215 13:58:40.758430 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:40 crc kubenswrapper[4794]: E1215 13:58:40.758948 4794 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:40 crc kubenswrapper[4794]: I1215 13:58:40.759458 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.028053 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.028401 4794 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9" exitCode=1 Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.028456 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9"} Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.029407 4794 scope.go:117] "RemoveContainer" containerID="4a37d4a6a0256dc60f9745572a12dc060e5b25ff4d957c937452933dae8a43c9" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.029939 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.030701 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.031156 4794 status_manager.go:851] "Failed to get 
status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.032553 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f29a831e837aedd752fbc08c52986f3dcbb023fd57b7d369fea84bca30fcd672"} Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.736521 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.736549 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.737452 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.737615 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:41 crc kubenswrapper[4794]: I1215 13:58:41.750602 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" podUID="28dac958-62ff-4d38-9bf6-a86fa57fb772" containerName="oauth-openshift" containerID="cri-o://8d3ced1f5a7834f1463053efec1c8c470aefae73bd2d692bbf92a64e6f06d84e" gracePeriod=15 Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.053076 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.053161 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"07a559dea2155f4bfdd46a28034e41f94772ed97b416289306bbc16c27fa4e6f"} Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.054487 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.058799 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.059137 4794 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.059510 4794 generic.go:334] "Generic (PLEG): container finished" podID="28dac958-62ff-4d38-9bf6-a86fa57fb772" containerID="8d3ced1f5a7834f1463053efec1c8c470aefae73bd2d692bbf92a64e6f06d84e" exitCode=0 Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.059621 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" event={"ID":"28dac958-62ff-4d38-9bf6-a86fa57fb772","Type":"ContainerDied","Data":"8d3ced1f5a7834f1463053efec1c8c470aefae73bd2d692bbf92a64e6f06d84e"} Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.069454 4794 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="27ad92a82c73f49bef4034a04ac91d73fae6839486df1f104e4878195d5fc758" exitCode=0 Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.069516 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"27ad92a82c73f49bef4034a04ac91d73fae6839486df1f104e4878195d5fc758"} Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.069914 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.069936 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.070436 4794 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.072268 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.072551 4794 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.073353 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.345547 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.346411 4794 status_manager.go:851] "Failed to get status for pod" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.346886 4794 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.347321 4794 status_manager.go:851] "Failed to get status for pod" podUID="28dac958-62ff-4d38-9bf6-a86fa57fb772" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-2rt5j\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.347562 4794 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.370370 4794 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.151:6443: connect: connection refused" interval="7s" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527056 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-serving-cert\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527109 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-router-certs\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527140 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-dir\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527178 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbmqh\" (UniqueName: \"kubernetes.io/projected/28dac958-62ff-4d38-9bf6-a86fa57fb772-kube-api-access-nbmqh\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527200 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-session\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527225 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-provider-selection\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527292 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-policies\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527316 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-login\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527346 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-ocp-branding-template\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527370 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-trusted-ca-bundle\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527392 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-cliconfig\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527454 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-idp-0-file-data\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527489 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-service-ca\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.527521 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-error\") pod \"28dac958-62ff-4d38-9bf6-a86fa57fb772\" (UID: \"28dac958-62ff-4d38-9bf6-a86fa57fb772\") " Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.528339 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.530671 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.530772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.531124 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.531356 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.534864 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.535178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28dac958-62ff-4d38-9bf6-a86fa57fb772-kube-api-access-nbmqh" (OuterVolumeSpecName: "kube-api-access-nbmqh") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "kube-api-access-nbmqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.535774 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.535897 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.536023 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.536213 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.537363 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.540904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.559095 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "28dac958-62ff-4d38-9bf6-a86fa57fb772" (UID: "28dac958-62ff-4d38-9bf6-a86fa57fb772"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.597125 4794 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 15 13:58:42 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b" Netns:"/var/run/netns/d2d97ad5-e5a3-47ce-847d-ff5af8873c94" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:58:42 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 15 13:58:42 crc kubenswrapper[4794]: > Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.597209 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 15 13:58:42 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b" Netns:"/var/run/netns/d2d97ad5-e5a3-47ce-847d-ff5af8873c94" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod 
[openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:58:42 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 15 13:58:42 crc kubenswrapper[4794]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.597238 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 15 13:58:42 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b" 
Netns:"/var/run/netns/d2d97ad5-e5a3-47ce-847d-ff5af8873c94" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:58:42 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 15 13:58:42 crc kubenswrapper[4794]: > pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.597301 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"networking-console-plugin-85b44fc459-gdk6g_openshift-network-console(5fe485a1-e14f-4c09-b5b9-f252bc42b7e8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-85b44fc459-gdk6g_openshift-network-console_5fe485a1-e14f-4c09-b5b9-f252bc42b7e8_0(28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b): error adding pod openshift-network-console_networking-console-plugin-85b44fc459-gdk6g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b\\\" Netns:\\\"/var/run/netns/d2d97ad5-e5a3-47ce-847d-ff5af8873c94\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-85b44fc459-gdk6g;K8S_POD_INFRA_CONTAINER_ID=28a5affce7af6cfdd79d875e26b84eac578e74b05273d1f12abae729bdcaba5b;K8S_POD_UID=5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] networking: Multus: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-85b44fc459-gdk6g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g?timeout=1m0s\\\": dial tcp 38.102.83.151:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628585 4794 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628664 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628688 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628705 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628723 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628739 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628756 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628771 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628786 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628801 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628818 4794 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28dac958-62ff-4d38-9bf6-a86fa57fb772-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628834 4794 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628850 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbmqh\" (UniqueName: \"kubernetes.io/projected/28dac958-62ff-4d38-9bf6-a86fa57fb772-kube-api-access-nbmqh\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.628866 4794 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28dac958-62ff-4d38-9bf6-a86fa57fb772-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.694638 4794 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 15 13:58:42 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b): error adding pod openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b" Netns:"/var/run/netns/8c0d314a-e902-4bdb-bf1f-8e4e05291705" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: 
[openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:58:42 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 15 13:58:42 crc kubenswrapper[4794]: > Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.694700 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 15 13:58:42 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b): error adding pod openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b" Netns:"/var/run/netns/8c0d314a-e902-4bdb-bf1f-8e4e05291705" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: [openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:58:42 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 15 13:58:42 crc kubenswrapper[4794]: > pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.694718 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 15 13:58:42 crc kubenswrapper[4794]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b): error adding pod 
openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b" Netns:"/var/run/netns/8c0d314a-e902-4bdb-bf1f-8e4e05291705" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: [openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s": dial tcp 38.102.83.151:6443: connect: connection refused Dec 15 13:58:42 crc kubenswrapper[4794]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 15 13:58:42 crc kubenswrapper[4794]: > pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:42 crc kubenswrapper[4794]: E1215 13:58:42.694770 4794 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-source-55646444c4-trplf_openshift-network-diagnostics_9d751cbb-f2e2-430d-9754-c882a5e924a5_0(100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b): error adding pod openshift-network-diagnostics_network-check-source-55646444c4-trplf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b\\\" Netns:\\\"/var/run/netns/8c0d314a-e902-4bdb-bf1f-8e4e05291705\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-source-55646444c4-trplf;K8S_POD_INFRA_CONTAINER_ID=100eea91c853ab72c78e63ebb5ba58a3b89a4c89c2058fa18fcc5eba758f5b3b;K8S_POD_UID=9d751cbb-f2e2-430d-9754-c882a5e924a5\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-source-55646444c4-trplf] networking: Multus: [openshift-network-diagnostics/network-check-source-55646444c4-trplf/9d751cbb-f2e2-430d-9754-c882a5e924a5]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: SetNetworkStatus: failed to update the pod network-check-source-55646444c4-trplf in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-source-55646444c4-trplf?timeout=1m0s\\\": dial tcp 
38.102.83.151:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.738054 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:58:42 crc kubenswrapper[4794]: I1215 13:58:42.738489 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.078240 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" event={"ID":"28dac958-62ff-4d38-9bf6-a86fa57fb772","Type":"ContainerDied","Data":"fa06418e6deb67921ed36962cab7f275c5525df2bdbcfad8ac70c0961a29eca0"} Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.078287 4794 scope.go:117] "RemoveContainer" containerID="8d3ced1f5a7834f1463053efec1c8c470aefae73bd2d692bbf92a64e6f06d84e" Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.078290 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2rt5j" Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.084661 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3b37ee78bb7f2ad12dd38c32261bea005505cf07fb801e25a4ddd031b590166c"} Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.084705 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c6697819510e6f1ab313cd2d89b6e517640fae62be1e952b1ea80aaa02f333e"} Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.084718 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e99af102d43ac0ea9d2806ac30c1295c84ca0fc603effb6e1a83a32b1414a0c"} Dec 15 13:58:43 crc kubenswrapper[4794]: I1215 13:58:43.084729 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4335c2b080e10b94c80a5d0c4ddeba7440175324e5376df5eecdf2435eb58950"} Dec 15 13:58:44 crc kubenswrapper[4794]: I1215 13:58:44.092669 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3393dd70d09980e4ba58a4e70c9beaa0e8672f71bfff120f554fb9cc71776668"} Dec 15 13:58:44 crc kubenswrapper[4794]: I1215 13:58:44.092955 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:44 crc kubenswrapper[4794]: I1215 13:58:44.092982 4794 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:44 crc kubenswrapper[4794]: I1215 13:58:44.092985 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:45 crc kubenswrapper[4794]: I1215 13:58:45.760460 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:45 crc kubenswrapper[4794]: I1215 13:58:45.760542 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:45 crc kubenswrapper[4794]: I1215 13:58:45.770157 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:48 crc kubenswrapper[4794]: I1215 13:58:48.938975 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:58:49 crc kubenswrapper[4794]: I1215 13:58:49.053192 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:58:49 crc kubenswrapper[4794]: I1215 13:58:49.058098 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:58:49 crc kubenswrapper[4794]: I1215 13:58:49.104377 4794 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:49 crc kubenswrapper[4794]: I1215 13:58:49.119846 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"629fdc883f66dc81fadea448f5530c654adca91e8f7e79100b4e942aec4b2551"} Dec 15 13:58:49 crc 
kubenswrapper[4794]: I1215 13:58:49.119881 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d302a2bfbc3e5814eae9a258f9610694a5a29980447a8ae4c04449d1b5c7d5a9"} Dec 15 13:58:49 crc kubenswrapper[4794]: I1215 13:58:49.120308 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:58:50 crc kubenswrapper[4794]: I1215 13:58:50.123896 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:50 crc kubenswrapper[4794]: I1215 13:58:50.123933 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:50 crc kubenswrapper[4794]: I1215 13:58:50.129334 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:58:50 crc kubenswrapper[4794]: I1215 13:58:50.131396 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="42a717b3-abd4-4872-9ef7-ca1a81b07ca0" Dec 15 13:58:51 crc kubenswrapper[4794]: I1215 13:58:51.128762 4794 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:51 crc kubenswrapper[4794]: I1215 13:58:51.129103 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1fbcc318-495e-462d-9990-8688e8ca6584" Dec 15 13:58:54 crc kubenswrapper[4794]: I1215 13:58:54.737206 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:54 crc kubenswrapper[4794]: I1215 13:58:54.738093 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 15 13:58:55 crc kubenswrapper[4794]: W1215 13:58:55.193556 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-0ef5fe37202e24323cfb80fa481711065c2ce07f02d5f78aab1ff7114afae919 WatchSource:0}: Error finding container 0ef5fe37202e24323cfb80fa481711065c2ce07f02d5f78aab1ff7114afae919: Status 404 returned error can't find the container with id 0ef5fe37202e24323cfb80fa481711065c2ce07f02d5f78aab1ff7114afae919 Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.168903 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.168985 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="1b80311eb43c0315ab8121ee788db96f912fdd92ccfadd35108ae01610abbed6" exitCode=255 Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.169027 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"1b80311eb43c0315ab8121ee788db96f912fdd92ccfadd35108ae01610abbed6"} Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.169064 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0ef5fe37202e24323cfb80fa481711065c2ce07f02d5f78aab1ff7114afae919"} Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.169671 4794 scope.go:117] "RemoveContainer" containerID="1b80311eb43c0315ab8121ee788db96f912fdd92ccfadd35108ae01610abbed6" Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.736548 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:56 crc kubenswrapper[4794]: I1215 13:58:56.737402 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 15 13:58:57 crc kubenswrapper[4794]: I1215 13:58:57.177062 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 15 13:58:57 crc kubenswrapper[4794]: I1215 13:58:57.178210 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 15 13:58:57 crc kubenswrapper[4794]: I1215 13:58:57.178268 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="99ab443a6f31b183b3aa0212c01fccef7cc24ea6bd714af270eb1084b27541de" exitCode=255 Dec 15 13:58:57 crc kubenswrapper[4794]: I1215 13:58:57.178308 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"99ab443a6f31b183b3aa0212c01fccef7cc24ea6bd714af270eb1084b27541de"} Dec 15 13:58:57 crc kubenswrapper[4794]: I1215 13:58:57.178357 4794 scope.go:117] "RemoveContainer" 
containerID="1b80311eb43c0315ab8121ee788db96f912fdd92ccfadd35108ae01610abbed6" Dec 15 13:58:57 crc kubenswrapper[4794]: I1215 13:58:57.179098 4794 scope.go:117] "RemoveContainer" containerID="99ab443a6f31b183b3aa0212c01fccef7cc24ea6bd714af270eb1084b27541de" Dec 15 13:58:57 crc kubenswrapper[4794]: E1215 13:58:57.179912 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 15 13:58:57 crc kubenswrapper[4794]: W1215 13:58:57.204717 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a42aa95ad3f4db39dd8d9ca6975d425912cc51375fd9a2312ca227003255d284 WatchSource:0}: Error finding container a42aa95ad3f4db39dd8d9ca6975d425912cc51375fd9a2312ca227003255d284: Status 404 returned error can't find the container with id a42aa95ad3f4db39dd8d9ca6975d425912cc51375fd9a2312ca227003255d284 Dec 15 13:58:58 crc kubenswrapper[4794]: I1215 13:58:58.185393 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 15 13:58:58 crc kubenswrapper[4794]: I1215 13:58:58.188106 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3b20699ce5def76aac3c9d9b58e5abe3839e27c3aff321fc88ede0fe39794b1d"} Dec 15 13:58:58 crc kubenswrapper[4794]: I1215 13:58:58.188228 4794 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a42aa95ad3f4db39dd8d9ca6975d425912cc51375fd9a2312ca227003255d284"} Dec 15 13:58:58 crc kubenswrapper[4794]: I1215 13:58:58.576577 4794 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 15 13:58:58 crc kubenswrapper[4794]: I1215 13:58:58.770687 4794 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="42a717b3-abd4-4872-9ef7-ca1a81b07ca0" Dec 15 13:58:58 crc kubenswrapper[4794]: I1215 13:58:58.950102 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 15 13:58:59 crc kubenswrapper[4794]: I1215 13:58:59.376882 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 15 13:58:59 crc kubenswrapper[4794]: I1215 13:58:59.447600 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 15 13:58:59 crc kubenswrapper[4794]: I1215 13:58:59.711954 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.027748 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.271561 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.329163 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.792350 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.854881 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.855964 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 15 13:59:00 crc kubenswrapper[4794]: I1215 13:59:00.953337 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 15 13:59:01 crc kubenswrapper[4794]: I1215 13:59:01.135227 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 15 13:59:01 crc kubenswrapper[4794]: I1215 13:59:01.223120 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 15 13:59:01 crc kubenswrapper[4794]: I1215 13:59:01.478079 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 15 13:59:01 crc kubenswrapper[4794]: I1215 13:59:01.734833 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 15 13:59:01 crc kubenswrapper[4794]: I1215 13:59:01.785608 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.035225 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 15 13:59:02 
crc kubenswrapper[4794]: I1215 13:59:02.086630 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.107047 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.221193 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.239072 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.261277 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.277754 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.385085 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.387744 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.413190 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.497712 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.500120 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.571963 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.594207 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.616012 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.786096 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.788116 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.839122 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.933243 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 15 13:59:02 crc kubenswrapper[4794]: I1215 13:59:02.979914 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.003519 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.058559 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 
13:59:03.059935 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.142378 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.168265 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.274492 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.283705 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.316440 4794 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.322852 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=38.322823121 podStartE2EDuration="38.322823121s" podCreationTimestamp="2025-12-15 13:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:58:48.957304151 +0000 UTC m=+290.809326589" watchObservedRunningTime="2025-12-15 13:59:03.322823121 +0000 UTC m=+305.174845639" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.324310 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2rt5j","openshift-kube-apiserver/kube-apiserver-crc"] Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.324398 4794 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.329564 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.357546 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.357521901 podStartE2EDuration="14.357521901s" podCreationTimestamp="2025-12-15 13:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:59:03.34949807 +0000 UTC m=+305.201520518" watchObservedRunningTime="2025-12-15 13:59:03.357521901 +0000 UTC m=+305.209544339" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.393118 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.399391 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.431419 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.478338 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.498508 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.586684 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 15 13:59:03 crc 
kubenswrapper[4794]: I1215 13:59:03.623524 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.635266 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.681035 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.719503 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.744425 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.801192 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.862084 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.916374 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 15 13:59:03 crc kubenswrapper[4794]: I1215 13:59:03.964935 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.003440 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.099559 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.297997 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.409144 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.481557 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.621142 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.654426 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.669480 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.749188 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28dac958-62ff-4d38-9bf6-a86fa57fb772" path="/var/lib/kubelet/pods/28dac958-62ff-4d38-9bf6-a86fa57fb772/volumes" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.843196 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 15 13:59:04 crc kubenswrapper[4794]: I1215 13:59:04.910497 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.050714 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 15 13:59:05 crc 
kubenswrapper[4794]: I1215 13:59:05.120325 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.240804 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.259477 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.330975 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.342822 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.343384 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.349398 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.377139 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.485521 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.545327 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.698526 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.851451 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 15 13:59:05 crc kubenswrapper[4794]: I1215 13:59:05.867612 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.076705 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.077458 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.169050 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.173179 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.178174 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.267275 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.321642 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.361684 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.412078 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.415621 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.444330 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.454622 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.460450 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.593729 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.659055 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 15 13:59:06 crc kubenswrapper[4794]: I1215 13:59:06.954977 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.016345 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.027173 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.032223 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.107305 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.145436 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.206871 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.225013 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.248571 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.284856 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.408737 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.409777 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.419763 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.447211 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.838649 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 15 13:59:07 crc kubenswrapper[4794]: I1215 13:59:07.904317 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.007993 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.016530 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.098600 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.129915 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.262936 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.267443 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.357261 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.360405 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.471177 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.473023 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.485252 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.746630 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.761966 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.767363 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.797232 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.861311 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.882571 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.931925 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.939664 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 15 13:59:08 crc kubenswrapper[4794]: I1215 13:59:08.984465 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.070944 4794 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.307189 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"]
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.307436 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" podUID="5ca1b031-98e5-47b0-87da-ca66cee361c3" containerName="controller-manager" containerID="cri-o://f9250d7eda278dd06d741168bd5f13fbfe1506db50bc6b74ca54d5d2eb995c95" gracePeriod=30
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.321150 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.330499 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.354386 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.354723 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.390247 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"]
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.390427 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" podUID="7f4ac38e-f424-493a-8c65-3002f7ed1626" containerName="route-controller-manager" containerID="cri-o://f128febb89135628d56f80402fcdfc96c099db1292e6461429306c8c0282d2d6" gracePeriod=30
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.472956 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.494603 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.502742 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.642675 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.765769 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.766391 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.984800 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 15 13:59:09 crc kubenswrapper[4794]: I1215 13:59:09.996416 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.018898 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.203311 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.239283 4794 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.257197 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.259880 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ca1b031-98e5-47b0-87da-ca66cee361c3" containerID="f9250d7eda278dd06d741168bd5f13fbfe1506db50bc6b74ca54d5d2eb995c95" exitCode=0
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.260060 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" event={"ID":"5ca1b031-98e5-47b0-87da-ca66cee361c3","Type":"ContainerDied","Data":"f9250d7eda278dd06d741168bd5f13fbfe1506db50bc6b74ca54d5d2eb995c95"}
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.262007 4794 generic.go:334] "Generic (PLEG): container finished" podID="7f4ac38e-f424-493a-8c65-3002f7ed1626" containerID="f128febb89135628d56f80402fcdfc96c099db1292e6461429306c8c0282d2d6" exitCode=0
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.262034 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" event={"ID":"7f4ac38e-f424-493a-8c65-3002f7ed1626","Type":"ContainerDied","Data":"f128febb89135628d56f80402fcdfc96c099db1292e6461429306c8c0282d2d6"}
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.291967 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.321631 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.322533 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.323347 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.329492 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.330386 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.382015 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.426931 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499093 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62wtm\" (UniqueName: \"kubernetes.io/projected/5ca1b031-98e5-47b0-87da-ca66cee361c3-kube-api-access-62wtm\") pod \"5ca1b031-98e5-47b0-87da-ca66cee361c3\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499152 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-proxy-ca-bundles\") pod \"5ca1b031-98e5-47b0-87da-ca66cee361c3\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499224 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-config\") pod \"7f4ac38e-f424-493a-8c65-3002f7ed1626\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499259 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-client-ca\") pod \"5ca1b031-98e5-47b0-87da-ca66cee361c3\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499301 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca1b031-98e5-47b0-87da-ca66cee361c3-serving-cert\") pod \"5ca1b031-98e5-47b0-87da-ca66cee361c3\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499336 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f4ac38e-f424-493a-8c65-3002f7ed1626-serving-cert\") pod \"7f4ac38e-f424-493a-8c65-3002f7ed1626\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499367 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-client-ca\") pod \"7f4ac38e-f424-493a-8c65-3002f7ed1626\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499415 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5gp\" (UniqueName: \"kubernetes.io/projected/7f4ac38e-f424-493a-8c65-3002f7ed1626-kube-api-access-bx5gp\") pod \"7f4ac38e-f424-493a-8c65-3002f7ed1626\" (UID: \"7f4ac38e-f424-493a-8c65-3002f7ed1626\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499433 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-config\") pod \"5ca1b031-98e5-47b0-87da-ca66cee361c3\" (UID: \"5ca1b031-98e5-47b0-87da-ca66cee361c3\") "
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499918 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5ca1b031-98e5-47b0-87da-ca66cee361c3" (UID: "5ca1b031-98e5-47b0-87da-ca66cee361c3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.499994 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-config" (OuterVolumeSpecName: "config") pod "7f4ac38e-f424-493a-8c65-3002f7ed1626" (UID: "7f4ac38e-f424-493a-8c65-3002f7ed1626"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.500216 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ca1b031-98e5-47b0-87da-ca66cee361c3" (UID: "5ca1b031-98e5-47b0-87da-ca66cee361c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.500192 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-client-ca" (OuterVolumeSpecName: "client-ca") pod "7f4ac38e-f424-493a-8c65-3002f7ed1626" (UID: "7f4ac38e-f424-493a-8c65-3002f7ed1626"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.500543 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-config" (OuterVolumeSpecName: "config") pod "5ca1b031-98e5-47b0-87da-ca66cee361c3" (UID: "5ca1b031-98e5-47b0-87da-ca66cee361c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.504122 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca1b031-98e5-47b0-87da-ca66cee361c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ca1b031-98e5-47b0-87da-ca66cee361c3" (UID: "5ca1b031-98e5-47b0-87da-ca66cee361c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.504123 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4ac38e-f424-493a-8c65-3002f7ed1626-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7f4ac38e-f424-493a-8c65-3002f7ed1626" (UID: "7f4ac38e-f424-493a-8c65-3002f7ed1626"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.504715 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4ac38e-f424-493a-8c65-3002f7ed1626-kube-api-access-bx5gp" (OuterVolumeSpecName: "kube-api-access-bx5gp") pod "7f4ac38e-f424-493a-8c65-3002f7ed1626" (UID: "7f4ac38e-f424-493a-8c65-3002f7ed1626"). InnerVolumeSpecName "kube-api-access-bx5gp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.504792 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca1b031-98e5-47b0-87da-ca66cee361c3-kube-api-access-62wtm" (OuterVolumeSpecName: "kube-api-access-62wtm") pod "5ca1b031-98e5-47b0-87da-ca66cee361c3" (UID: "5ca1b031-98e5-47b0-87da-ca66cee361c3"). InnerVolumeSpecName "kube-api-access-62wtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591167 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"]
Dec 15 13:59:10 crc kubenswrapper[4794]: E1215 13:59:10.591448 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" containerName="installer"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591469 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" containerName="installer"
Dec 15 13:59:10 crc kubenswrapper[4794]: E1215 13:59:10.591494 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4ac38e-f424-493a-8c65-3002f7ed1626" containerName="route-controller-manager"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591504 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4ac38e-f424-493a-8c65-3002f7ed1626" containerName="route-controller-manager"
Dec 15 13:59:10 crc kubenswrapper[4794]: E1215 13:59:10.591520 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28dac958-62ff-4d38-9bf6-a86fa57fb772" containerName="oauth-openshift"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591529 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="28dac958-62ff-4d38-9bf6-a86fa57fb772" containerName="oauth-openshift"
Dec 15 13:59:10 crc kubenswrapper[4794]: E1215 13:59:10.591541 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca1b031-98e5-47b0-87da-ca66cee361c3" containerName="controller-manager"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591550 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca1b031-98e5-47b0-87da-ca66cee361c3" containerName="controller-manager"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591712 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca1b031-98e5-47b0-87da-ca66cee361c3" containerName="controller-manager"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591730 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fd189c-2cde-48f0-8b4c-364e646fd81b" containerName="installer"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591739 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4ac38e-f424-493a-8c65-3002f7ed1626" containerName="route-controller-manager"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.591752 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="28dac958-62ff-4d38-9bf6-a86fa57fb772" containerName="oauth-openshift"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.592226 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.595867 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-74795b69d5-sqxkz"]
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.596687 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.597854 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.598031 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.598659 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.599359 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.599906 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600104 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600273 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600529 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600840 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600859 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca1b031-98e5-47b0-87da-ca66cee361c3-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600881 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f4ac38e-f424-493a-8c65-3002f7ed1626-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600894 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-client-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600907 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5gp\" (UniqueName: \"kubernetes.io/projected/7f4ac38e-f424-493a-8c65-3002f7ed1626-kube-api-access-bx5gp\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600922 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600933 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62wtm\" (UniqueName: \"kubernetes.io/projected/5ca1b031-98e5-47b0-87da-ca66cee361c3-kube-api-access-62wtm\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600945 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.600992 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f4ac38e-f424-493a-8c65-3002f7ed1626-config\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.601016 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ca1b031-98e5-47b0-87da-ca66cee361c3-client-ca\") on node \"crc\" DevicePath \"\""
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.601029 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.601300 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.602394 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.607987 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.611497 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.615793 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.618550 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.649975 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.682012 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.701676 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr75f\" (UniqueName: \"kubernetes.io/projected/a82015e3-1f43-4442-8064-be5152d3622c-kube-api-access-hr75f\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.701718 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-login\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.701747 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-error\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.702718 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tlr\" (UniqueName: \"kubernetes.io/projected/4b060cb1-e289-4b25-a614-4f14384dafc5-kube-api-access-l6tlr\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.702930 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.703268 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.703490 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.703909 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz"
Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.704359 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.704449 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.704541 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.704658 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-audit-policies\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.704740 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82015e3-1f43-4442-8064-be5152d3622c-serving-cert\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 
crc kubenswrapper[4794]: I1215 13:59:10.704795 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-config\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.704878 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.705035 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-session\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.705118 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b060cb1-e289-4b25-a614-4f14384dafc5-audit-dir\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.705260 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-client-ca\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.737282 4794 scope.go:117] "RemoveContainer" containerID="99ab443a6f31b183b3aa0212c01fccef7cc24ea6bd714af270eb1084b27541de" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.750093 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807076 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr75f\" (UniqueName: \"kubernetes.io/projected/a82015e3-1f43-4442-8064-be5152d3622c-kube-api-access-hr75f\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807144 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-login\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807201 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-error\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: 
I1215 13:59:10.807246 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tlr\" (UniqueName: \"kubernetes.io/projected/4b060cb1-e289-4b25-a614-4f14384dafc5-kube-api-access-l6tlr\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807305 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807355 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807428 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807460 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807500 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807535 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807568 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-audit-policies\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " 
pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807652 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82015e3-1f43-4442-8064-be5152d3622c-serving-cert\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-config\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807751 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807802 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-session\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807879 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4b060cb1-e289-4b25-a614-4f14384dafc5-audit-dir\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.807913 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-client-ca\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.809263 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-client-ca\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.811007 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-audit-policies\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.811446 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.811744 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.812025 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b060cb1-e289-4b25-a614-4f14384dafc5-audit-dir\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.814803 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-session\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.815800 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.816044 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-login\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") 
" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.816171 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.817533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.819754 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-template-error\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.820136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82015e3-1f43-4442-8064-be5152d3622c-serving-cert\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.820255 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.821320 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.823996 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-config\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.824607 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b060cb1-e289-4b25-a614-4f14384dafc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.825850 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr75f\" (UniqueName: \"kubernetes.io/projected/a82015e3-1f43-4442-8064-be5152d3622c-kube-api-access-hr75f\") pod \"route-controller-manager-58c5895464-gtr2z\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " 
pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.834556 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tlr\" (UniqueName: \"kubernetes.io/projected/4b060cb1-e289-4b25-a614-4f14384dafc5-kube-api-access-l6tlr\") pod \"oauth-openshift-74795b69d5-sqxkz\" (UID: \"4b060cb1-e289-4b25-a614-4f14384dafc5\") " pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.875425 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.916003 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:10 crc kubenswrapper[4794]: I1215 13:59:10.923054 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.014697 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.030133 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.085567 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.105328 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.212132 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.250677 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.266077 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.273155 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" event={"ID":"7f4ac38e-f424-493a-8c65-3002f7ed1626","Type":"ContainerDied","Data":"bd58b4b2233a1ed332aab6ebd408dd0b960459e1186dba3daec20365c484a40e"} Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.273229 4794 scope.go:117] "RemoveContainer" containerID="f128febb89135628d56f80402fcdfc96c099db1292e6461429306c8c0282d2d6" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.273388 4794 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.277984 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.278094 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f305dc84672c7829f3f5bfdc8995b289da4452b3461d39ce7916ea097c713efb"} Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.281569 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" event={"ID":"5ca1b031-98e5-47b0-87da-ca66cee361c3","Type":"ContainerDied","Data":"e3f4ed99ad6b5ae7d6dbbe0c0e279b090ccc1ac386617ee40ab34d7a05938c02"} Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.281738 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8c77c5b59-hdwz6" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.295952 4794 scope.go:117] "RemoveContainer" containerID="f9250d7eda278dd06d741168bd5f13fbfe1506db50bc6b74ca54d5d2eb995c95" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.301668 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.309424 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c79c46f4f-5s9xn"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.346626 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.350322 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8c77c5b59-hdwz6"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.454040 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.548511 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.572919 4794 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.573225 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4" gracePeriod=5 Dec 15 13:59:11 crc 
kubenswrapper[4794]: I1215 13:59:11.601168 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.606875 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.813334 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-74795b69d5-sqxkz"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.830865 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.849459 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.904817 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9"] Dec 15 13:59:11 crc kubenswrapper[4794]: E1215 13:59:11.905009 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.905019 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.905113 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.905452 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.908460 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.908608 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.908854 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.909314 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.909452 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.910563 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.914888 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9"] Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.929248 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 15 13:59:11 crc kubenswrapper[4794]: I1215 13:59:11.959086 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.030056 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 15 13:59:12 
crc kubenswrapper[4794]: I1215 13:59:12.033398 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-config\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.033431 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-serving-cert\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.033481 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-proxy-ca-bundles\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.033507 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhh6f\" (UniqueName: \"kubernetes.io/projected/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-kube-api-access-vhh6f\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.033537 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-client-ca\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.082736 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.134727 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-client-ca\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.134834 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-config\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.134880 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-serving-cert\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.134962 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-proxy-ca-bundles\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: 
\"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.135007 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhh6f\" (UniqueName: \"kubernetes.io/projected/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-kube-api-access-vhh6f\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.135967 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-client-ca\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.136306 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-config\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.136989 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-proxy-ca-bundles\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.147606 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-serving-cert\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.177506 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhh6f\" (UniqueName: \"kubernetes.io/projected/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-kube-api-access-vhh6f\") pod \"controller-manager-6b5d8ff67b-zxhc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.191118 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.232937 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.269256 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.349279 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.510104 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.620110 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.736145 4794 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.745955 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca1b031-98e5-47b0-87da-ca66cee361c3" path="/var/lib/kubelet/pods/5ca1b031-98e5-47b0-87da-ca66cee361c3/volumes" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.746630 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4ac38e-f424-493a-8c65-3002f7ed1626" path="/var/lib/kubelet/pods/7f4ac38e-f424-493a-8c65-3002f7ed1626/volumes" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.750638 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.814390 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 15 13:59:12 crc kubenswrapper[4794]: I1215 13:59:12.895715 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.003140 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.267794 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.440485 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.500428 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"] Dec 15 13:59:13 crc kubenswrapper[4794]: W1215 13:59:13.505468 4794 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda82015e3_1f43_4442_8064_be5152d3622c.slice/crio-e592c767580e80229fe3925b1e4895e5878ed41e697a774c20d0d82c68d5b522 WatchSource:0}: Error finding container e592c767580e80229fe3925b1e4895e5878ed41e697a774c20d0d82c68d5b522: Status 404 returned error can't find the container with id e592c767580e80229fe3925b1e4895e5878ed41e697a774c20d0d82c68d5b522 Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.526690 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.567985 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.607104 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9"] Dec 15 13:59:13 crc kubenswrapper[4794]: W1215 13:59:13.615317 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4bdb9e_91d7_4b96_9d1b_ac531bd98cc9.slice/crio-0acdbb4bf82c0a2b3a992dca4469d0487b8c2e8aabced60aadee928419d99871 WatchSource:0}: Error finding container 0acdbb4bf82c0a2b3a992dca4469d0487b8c2e8aabced60aadee928419d99871: Status 404 returned error can't find the container with id 0acdbb4bf82c0a2b3a992dca4469d0487b8c2e8aabced60aadee928419d99871 Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.639595 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-74795b69d5-sqxkz"] Dec 15 13:59:13 crc kubenswrapper[4794]: W1215 13:59:13.651503 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b060cb1_e289_4b25_a614_4f14384dafc5.slice/crio-0b91c33f6b1dbe9e506e584b279a988e22a54df332e19210125f9eb1cff95201 WatchSource:0}: Error finding container 0b91c33f6b1dbe9e506e584b279a988e22a54df332e19210125f9eb1cff95201: Status 404 returned error can't find the container with id 0b91c33f6b1dbe9e506e584b279a988e22a54df332e19210125f9eb1cff95201 Dec 15 13:59:13 crc kubenswrapper[4794]: I1215 13:59:13.998382 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.118898 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.235974 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.258452 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.310452 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" event={"ID":"a82015e3-1f43-4442-8064-be5152d3622c","Type":"ContainerStarted","Data":"b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6"} Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.310500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" event={"ID":"a82015e3-1f43-4442-8064-be5152d3622c","Type":"ContainerStarted","Data":"e592c767580e80229fe3925b1e4895e5878ed41e697a774c20d0d82c68d5b522"} Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.310621 4794 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.311959 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" event={"ID":"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9","Type":"ContainerStarted","Data":"ab3f9dcc393450d67af11afdab3e3d18621e48228aa0ba062fe8e18d6d7b4eb4"} Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.311991 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" event={"ID":"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9","Type":"ContainerStarted","Data":"0acdbb4bf82c0a2b3a992dca4469d0487b8c2e8aabced60aadee928419d99871"} Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.312130 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.313789 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" event={"ID":"4b060cb1-e289-4b25-a614-4f14384dafc5","Type":"ContainerStarted","Data":"92d03c1d513effe040785248257fe0eaa1ce59d94a5801adb877a81aad3adbf1"} Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.313814 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" event={"ID":"4b060cb1-e289-4b25-a614-4f14384dafc5","Type":"ContainerStarted","Data":"0b91c33f6b1dbe9e506e584b279a988e22a54df332e19210125f9eb1cff95201"} Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.313946 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.317140 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.317980 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.318074 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.327124 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" podStartSLOduration=5.327108004 podStartE2EDuration="5.327108004s" podCreationTimestamp="2025-12-15 13:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:59:14.326085274 +0000 UTC m=+316.178107722" watchObservedRunningTime="2025-12-15 13:59:14.327108004 +0000 UTC m=+316.179130442" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.349202 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.353711 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" podStartSLOduration=5.35369622 podStartE2EDuration="5.35369622s" podCreationTimestamp="2025-12-15 13:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:59:14.353307138 +0000 UTC m=+316.205329576" watchObservedRunningTime="2025-12-15 13:59:14.35369622 +0000 UTC m=+316.205718658" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.380640 4794 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.393738 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-74795b69d5-sqxkz" podStartSLOduration=58.393719233 podStartE2EDuration="58.393719233s" podCreationTimestamp="2025-12-15 13:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:59:14.39292323 +0000 UTC m=+316.244945688" watchObservedRunningTime="2025-12-15 13:59:14.393719233 +0000 UTC m=+316.245741681" Dec 15 13:59:14 crc kubenswrapper[4794]: I1215 13:59:14.834168 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.186508 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.186812 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229205 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229256 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229319 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229357 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229403 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229518 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229522 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229607 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229561 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229895 4794 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229935 4794 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229960 4794 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.229984 4794 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.236027 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.330658 4794 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.332924 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.332969 4794 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4" exitCode=137 Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.333015 4794 scope.go:117] "RemoveContainer" containerID="b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.333073 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.356118 4794 scope.go:117] "RemoveContainer" containerID="b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4" Dec 15 13:59:17 crc kubenswrapper[4794]: E1215 13:59:17.356736 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4\": container with ID starting with b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4 not found: ID does not exist" containerID="b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4" Dec 15 13:59:17 crc kubenswrapper[4794]: I1215 13:59:17.356803 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4"} err="failed to get container status \"b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4\": rpc error: code = NotFound desc = could not find container \"b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4\": container with ID starting with b5faf64a68ca10609b59c2ead91e92a3972729c09c3a9000b780f6c2dc3680c4 not found: ID does not exist" Dec 15 13:59:18 crc kubenswrapper[4794]: I1215 13:59:18.749947 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 15 13:59:18 crc kubenswrapper[4794]: I1215 13:59:18.750637 4794 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 15 13:59:18 crc kubenswrapper[4794]: I1215 13:59:18.765524 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 15 13:59:18 crc kubenswrapper[4794]: I1215 
13:59:18.765576 4794 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="fb80722d-ca7e-4464-b430-0f95f5b72b14" Dec 15 13:59:18 crc kubenswrapper[4794]: I1215 13:59:18.772638 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 15 13:59:18 crc kubenswrapper[4794]: I1215 13:59:18.772918 4794 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="fb80722d-ca7e-4464-b430-0f95f5b72b14" Dec 15 13:59:22 crc kubenswrapper[4794]: I1215 13:59:22.748458 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 15 13:59:24 crc kubenswrapper[4794]: I1215 13:59:24.496507 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 15 13:59:24 crc kubenswrapper[4794]: I1215 13:59:24.783854 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 15 13:59:25 crc kubenswrapper[4794]: I1215 13:59:25.511718 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 15 13:59:25 crc kubenswrapper[4794]: I1215 13:59:25.606864 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 15 13:59:26 crc kubenswrapper[4794]: I1215 13:59:26.990911 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 15 13:59:27 crc kubenswrapper[4794]: I1215 13:59:27.964036 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 15 
13:59:28 crc kubenswrapper[4794]: I1215 13:59:28.706129 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.274528 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9"] Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.274857 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" podUID="5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" containerName="controller-manager" containerID="cri-o://ab3f9dcc393450d67af11afdab3e3d18621e48228aa0ba062fe8e18d6d7b4eb4" gracePeriod=30 Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.309936 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"] Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.310117 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" podUID="a82015e3-1f43-4442-8064-be5152d3622c" containerName="route-controller-manager" containerID="cri-o://b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6" gracePeriod=30 Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.409728 4794 generic.go:334] "Generic (PLEG): container finished" podID="5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" containerID="ab3f9dcc393450d67af11afdab3e3d18621e48228aa0ba062fe8e18d6d7b4eb4" exitCode=0 Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.409828 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" event={"ID":"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9","Type":"ContainerDied","Data":"ab3f9dcc393450d67af11afdab3e3d18621e48228aa0ba062fe8e18d6d7b4eb4"} Dec 15 13:59:29 crc kubenswrapper[4794]: 
I1215 13:59:29.767452 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.823105 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-client-ca\") pod \"a82015e3-1f43-4442-8064-be5152d3622c\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.823182 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82015e3-1f43-4442-8064-be5152d3622c-serving-cert\") pod \"a82015e3-1f43-4442-8064-be5152d3622c\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.823266 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr75f\" (UniqueName: \"kubernetes.io/projected/a82015e3-1f43-4442-8064-be5152d3622c-kube-api-access-hr75f\") pod \"a82015e3-1f43-4442-8064-be5152d3622c\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.823354 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-config\") pod \"a82015e3-1f43-4442-8064-be5152d3622c\" (UID: \"a82015e3-1f43-4442-8064-be5152d3622c\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.823888 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-client-ca" (OuterVolumeSpecName: "client-ca") pod "a82015e3-1f43-4442-8064-be5152d3622c" (UID: "a82015e3-1f43-4442-8064-be5152d3622c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.824363 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-config" (OuterVolumeSpecName: "config") pod "a82015e3-1f43-4442-8064-be5152d3622c" (UID: "a82015e3-1f43-4442-8064-be5152d3622c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.828286 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82015e3-1f43-4442-8064-be5152d3622c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a82015e3-1f43-4442-8064-be5152d3622c" (UID: "a82015e3-1f43-4442-8064-be5152d3622c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.828703 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82015e3-1f43-4442-8064-be5152d3622c-kube-api-access-hr75f" (OuterVolumeSpecName: "kube-api-access-hr75f") pod "a82015e3-1f43-4442-8064-be5152d3622c" (UID: "a82015e3-1f43-4442-8064-be5152d3622c"). InnerVolumeSpecName "kube-api-access-hr75f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.838972 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924245 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-serving-cert\") pod \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-config\") pod \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924354 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhh6f\" (UniqueName: \"kubernetes.io/projected/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-kube-api-access-vhh6f\") pod \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924381 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-proxy-ca-bundles\") pod \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924401 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-client-ca\") pod \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\" (UID: \"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9\") " Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924568 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924600 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a82015e3-1f43-4442-8064-be5152d3622c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924613 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82015e3-1f43-4442-8064-be5152d3622c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.924625 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr75f\" (UniqueName: \"kubernetes.io/projected/a82015e3-1f43-4442-8064-be5152d3622c-kube-api-access-hr75f\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.925307 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" (UID: "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.926424 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-config" (OuterVolumeSpecName: "config") pod "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" (UID: "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.926444 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" (UID: "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.929205 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-kube-api-access-vhh6f" (OuterVolumeSpecName: "kube-api-access-vhh6f") pod "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" (UID: "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9"). InnerVolumeSpecName "kube-api-access-vhh6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 13:59:29 crc kubenswrapper[4794]: I1215 13:59:29.929212 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" (UID: "5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.025643 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.025690 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-config\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.025710 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhh6f\" (UniqueName: \"kubernetes.io/projected/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-kube-api-access-vhh6f\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.025728 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.025745 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.333834 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.417258 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" event={"ID":"5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9","Type":"ContainerDied","Data":"0acdbb4bf82c0a2b3a992dca4469d0487b8c2e8aabced60aadee928419d99871"} Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.417315 4794 
scope.go:117] "RemoveContainer" containerID="ab3f9dcc393450d67af11afdab3e3d18621e48228aa0ba062fe8e18d6d7b4eb4" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.417314 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.421009 4794 generic.go:334] "Generic (PLEG): container finished" podID="a82015e3-1f43-4442-8064-be5152d3622c" containerID="b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6" exitCode=0 Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.421037 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" event={"ID":"a82015e3-1f43-4442-8064-be5152d3622c","Type":"ContainerDied","Data":"b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6"} Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.421053 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" event={"ID":"a82015e3-1f43-4442-8064-be5152d3622c","Type":"ContainerDied","Data":"e592c767580e80229fe3925b1e4895e5878ed41e697a774c20d0d82c68d5b522"} Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.421087 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.443615 4794 scope.go:117] "RemoveContainer" containerID="b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.476631 4794 scope.go:117] "RemoveContainer" containerID="b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6" Dec 15 13:59:30 crc kubenswrapper[4794]: E1215 13:59:30.477917 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6\": container with ID starting with b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6 not found: ID does not exist" containerID="b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.477962 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6"} err="failed to get container status \"b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6\": rpc error: code = NotFound desc = could not find container \"b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6\": container with ID starting with b0dface072ece21631bf75f9c539e3985ac0f5efcafd1a39e67fa4394b812cb6 not found: ID does not exist" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.482846 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"] Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.486133 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-gtr2z"] Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.493713 4794 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9"] Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.497025 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-zxhc9"] Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.751398 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" path="/var/lib/kubelet/pods/5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9/volumes" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.751905 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a82015e3-1f43-4442-8064-be5152d3622c" path="/var/lib/kubelet/pods/a82015e3-1f43-4442-8064-be5152d3622c/volumes" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.921055 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-787df948cc-5dh8n"] Dec 15 13:59:30 crc kubenswrapper[4794]: E1215 13:59:30.921375 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" containerName="controller-manager" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.921392 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" containerName="controller-manager" Dec 15 13:59:30 crc kubenswrapper[4794]: E1215 13:59:30.921402 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82015e3-1f43-4442-8064-be5152d3622c" containerName="route-controller-manager" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.921409 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82015e3-1f43-4442-8064-be5152d3622c" containerName="route-controller-manager" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.921530 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4bdb9e-91d7-4b96-9d1b-ac531bd98cc9" 
containerName="controller-manager" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.921547 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82015e3-1f43-4442-8064-be5152d3622c" containerName="route-controller-manager" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.922034 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.924551 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.924685 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.924842 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.924920 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.925395 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.925554 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.928424 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9"] Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.929288 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.932516 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.932667 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.933033 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.933165 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.935293 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.936233 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.936459 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-787df948cc-5dh8n"] Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.940533 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 15 13:59:30 crc kubenswrapper[4794]: I1215 13:59:30.942333 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9"] Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037622 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-config\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037690 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-client-ca\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037732 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-config\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037758 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-proxy-ca-bundles\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037781 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/761723b9-3e5f-449d-ba36-4a41c9664ab1-serving-cert\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037803 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-serving-cert\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037826 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9vl\" (UniqueName: \"kubernetes.io/projected/761723b9-3e5f-449d-ba36-4a41c9664ab1-kube-api-access-rd9vl\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.037958 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flfsk\" (UniqueName: \"kubernetes.io/projected/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-kube-api-access-flfsk\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.038048 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-client-ca\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.139801 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-serving-cert\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.139854 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9vl\" (UniqueName: \"kubernetes.io/projected/761723b9-3e5f-449d-ba36-4a41c9664ab1-kube-api-access-rd9vl\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.139905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flfsk\" (UniqueName: \"kubernetes.io/projected/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-kube-api-access-flfsk\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.139937 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-client-ca\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.139994 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-config\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " 
pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.140028 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-client-ca\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.140055 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-config\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.140077 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-proxy-ca-bundles\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.140101 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/761723b9-3e5f-449d-ba36-4a41c9664ab1-serving-cert\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.141214 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-client-ca\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.141564 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-client-ca\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.141597 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-config\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.141733 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-config\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.144868 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/761723b9-3e5f-449d-ba36-4a41c9664ab1-serving-cert\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.144863 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-proxy-ca-bundles\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.146571 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-serving-cert\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.164179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flfsk\" (UniqueName: \"kubernetes.io/projected/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-kube-api-access-flfsk\") pod \"controller-manager-787df948cc-5dh8n\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.168280 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9vl\" (UniqueName: \"kubernetes.io/projected/761723b9-3e5f-449d-ba36-4a41c9664ab1-kube-api-access-rd9vl\") pod \"route-controller-manager-7f9d8fdb97-v6xg9\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.243518 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.252742 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.639714 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.639960 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9"] Dec 15 13:59:31 crc kubenswrapper[4794]: W1215 13:59:31.650722 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761723b9_3e5f_449d_ba36_4a41c9664ab1.slice/crio-b1bd23e4b3d0a5e462ec5bb25112331fe8fa2664ae77f196627b731848953056 WatchSource:0}: Error finding container b1bd23e4b3d0a5e462ec5bb25112331fe8fa2664ae77f196627b731848953056: Status 404 returned error can't find the container with id b1bd23e4b3d0a5e462ec5bb25112331fe8fa2664ae77f196627b731848953056 Dec 15 13:59:31 crc kubenswrapper[4794]: I1215 13:59:31.686355 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-787df948cc-5dh8n"] Dec 15 13:59:31 crc kubenswrapper[4794]: W1215 13:59:31.690606 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8911dd1b_2ad4_42af_ba34_d0cf0b6daa6a.slice/crio-254d95fcd55d18b7f685a61cbe2518ea4cbedbaa8ab7df2e7979f9752d89704b WatchSource:0}: Error finding container 254d95fcd55d18b7f685a61cbe2518ea4cbedbaa8ab7df2e7979f9752d89704b: Status 404 returned error can't find the container with id 254d95fcd55d18b7f685a61cbe2518ea4cbedbaa8ab7df2e7979f9752d89704b Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.439615 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" 
event={"ID":"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a","Type":"ContainerStarted","Data":"9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404"} Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.439981 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" event={"ID":"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a","Type":"ContainerStarted","Data":"254d95fcd55d18b7f685a61cbe2518ea4cbedbaa8ab7df2e7979f9752d89704b"} Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.440005 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.440626 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" event={"ID":"761723b9-3e5f-449d-ba36-4a41c9664ab1","Type":"ContainerStarted","Data":"08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3"} Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.440672 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" event={"ID":"761723b9-3e5f-449d-ba36-4a41c9664ab1","Type":"ContainerStarted","Data":"b1bd23e4b3d0a5e462ec5bb25112331fe8fa2664ae77f196627b731848953056"} Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.440863 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.445253 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.446457 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.485088 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" podStartSLOduration=3.485072631 podStartE2EDuration="3.485072631s" podCreationTimestamp="2025-12-15 13:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:59:32.466492268 +0000 UTC m=+334.318514726" watchObservedRunningTime="2025-12-15 13:59:32.485072631 +0000 UTC m=+334.337095069" Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.585469 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 15 13:59:32 crc kubenswrapper[4794]: I1215 13:59:32.924760 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 15 13:59:33 crc kubenswrapper[4794]: I1215 13:59:33.359475 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 15 13:59:33 crc kubenswrapper[4794]: I1215 13:59:33.366913 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 15 13:59:33 crc kubenswrapper[4794]: I1215 13:59:33.397352 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 15 13:59:33 crc kubenswrapper[4794]: I1215 13:59:33.455300 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 15 13:59:34 crc kubenswrapper[4794]: I1215 13:59:34.693241 4794 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Dec 15 13:59:35 crc kubenswrapper[4794]: I1215 13:59:35.000660 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 15 13:59:35 crc kubenswrapper[4794]: I1215 13:59:35.130897 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 15 13:59:35 crc kubenswrapper[4794]: I1215 13:59:35.160259 4794 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 15 13:59:35 crc kubenswrapper[4794]: I1215 13:59:35.867334 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 15 13:59:36 crc kubenswrapper[4794]: I1215 13:59:36.985399 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 15 13:59:39 crc kubenswrapper[4794]: I1215 13:59:39.140715 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 15 13:59:39 crc kubenswrapper[4794]: I1215 13:59:39.775797 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 15 13:59:43 crc kubenswrapper[4794]: I1215 13:59:43.288517 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 15 13:59:44 crc kubenswrapper[4794]: I1215 13:59:44.204628 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 15 13:59:44 crc kubenswrapper[4794]: I1215 13:59:44.482845 4794 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 15 13:59:45 crc kubenswrapper[4794]: I1215 13:59:45.148192 4794 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 15 13:59:47 crc kubenswrapper[4794]: I1215 13:59:47.299247 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 15 13:59:49 crc kubenswrapper[4794]: I1215 13:59:49.081695 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 15 13:59:50 crc kubenswrapper[4794]: I1215 13:59:50.114851 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 15 13:59:50 crc kubenswrapper[4794]: I1215 13:59:50.125171 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 15 13:59:51 crc kubenswrapper[4794]: I1215 13:59:51.416054 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 15 13:59:51 crc kubenswrapper[4794]: I1215 13:59:51.767487 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 15 13:59:51 crc kubenswrapper[4794]: I1215 13:59:51.970640 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 15 13:59:53 crc kubenswrapper[4794]: I1215 13:59:53.650627 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 15 13:59:54 crc kubenswrapper[4794]: I1215 13:59:54.534102 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 
13:59:54 crc kubenswrapper[4794]: I1215 13:59:54.534388 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 13:59:54 crc kubenswrapper[4794]: I1215 13:59:54.656372 4794 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.176512 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" podStartSLOduration=31.176485021 podStartE2EDuration="31.176485021s" podCreationTimestamp="2025-12-15 13:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 13:59:32.503050417 +0000 UTC m=+334.355072855" watchObservedRunningTime="2025-12-15 14:00:00.176485021 +0000 UTC m=+362.028507489" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.179708 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6"] Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.180384 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.182379 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.182500 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.191436 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6"] Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.223311 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8de1613-978c-4087-aabc-04a3ab8e30a8-secret-volume\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.223370 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwqnv\" (UniqueName: \"kubernetes.io/projected/a8de1613-978c-4087-aabc-04a3ab8e30a8-kube-api-access-jwqnv\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.223394 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8de1613-978c-4087-aabc-04a3ab8e30a8-config-volume\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.324843 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8de1613-978c-4087-aabc-04a3ab8e30a8-secret-volume\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.324893 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwqnv\" (UniqueName: \"kubernetes.io/projected/a8de1613-978c-4087-aabc-04a3ab8e30a8-kube-api-access-jwqnv\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.324920 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8de1613-978c-4087-aabc-04a3ab8e30a8-config-volume\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.326082 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8de1613-978c-4087-aabc-04a3ab8e30a8-config-volume\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.330531 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a8de1613-978c-4087-aabc-04a3ab8e30a8-secret-volume\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.345680 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwqnv\" (UniqueName: \"kubernetes.io/projected/a8de1613-978c-4087-aabc-04a3ab8e30a8-kube-api-access-jwqnv\") pod \"collect-profiles-29430120-8zfb6\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.497106 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:00 crc kubenswrapper[4794]: I1215 14:00:00.749787 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6"] Dec 15 14:00:00 crc kubenswrapper[4794]: W1215 14:00:00.755102 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8de1613_978c_4087_aabc_04a3ab8e30a8.slice/crio-f7c0c26826c3bbecb786fee5cb3728dc568e9397daf8c67af42483158837524e WatchSource:0}: Error finding container f7c0c26826c3bbecb786fee5cb3728dc568e9397daf8c67af42483158837524e: Status 404 returned error can't find the container with id f7c0c26826c3bbecb786fee5cb3728dc568e9397daf8c67af42483158837524e Dec 15 14:00:01 crc kubenswrapper[4794]: I1215 14:00:01.616657 4794 generic.go:334] "Generic (PLEG): container finished" podID="a8de1613-978c-4087-aabc-04a3ab8e30a8" containerID="5bc9c0eb345fddb6819db63203d7ef23f3f8a3f48db9c5ea71859d302f0e62fb" exitCode=0 Dec 15 14:00:01 crc kubenswrapper[4794]: I1215 14:00:01.616709 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" event={"ID":"a8de1613-978c-4087-aabc-04a3ab8e30a8","Type":"ContainerDied","Data":"5bc9c0eb345fddb6819db63203d7ef23f3f8a3f48db9c5ea71859d302f0e62fb"} Dec 15 14:00:01 crc kubenswrapper[4794]: I1215 14:00:01.616740 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" event={"ID":"a8de1613-978c-4087-aabc-04a3ab8e30a8","Type":"ContainerStarted","Data":"f7c0c26826c3bbecb786fee5cb3728dc568e9397daf8c67af42483158837524e"} Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.946412 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.963636 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8de1613-978c-4087-aabc-04a3ab8e30a8-config-volume\") pod \"a8de1613-978c-4087-aabc-04a3ab8e30a8\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.963704 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8de1613-978c-4087-aabc-04a3ab8e30a8-secret-volume\") pod \"a8de1613-978c-4087-aabc-04a3ab8e30a8\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.963828 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwqnv\" (UniqueName: \"kubernetes.io/projected/a8de1613-978c-4087-aabc-04a3ab8e30a8-kube-api-access-jwqnv\") pod \"a8de1613-978c-4087-aabc-04a3ab8e30a8\" (UID: \"a8de1613-978c-4087-aabc-04a3ab8e30a8\") " Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.964716 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a8de1613-978c-4087-aabc-04a3ab8e30a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8de1613-978c-4087-aabc-04a3ab8e30a8" (UID: "a8de1613-978c-4087-aabc-04a3ab8e30a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.969626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8de1613-978c-4087-aabc-04a3ab8e30a8-kube-api-access-jwqnv" (OuterVolumeSpecName: "kube-api-access-jwqnv") pod "a8de1613-978c-4087-aabc-04a3ab8e30a8" (UID: "a8de1613-978c-4087-aabc-04a3ab8e30a8"). InnerVolumeSpecName "kube-api-access-jwqnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:02 crc kubenswrapper[4794]: I1215 14:00:02.970830 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8de1613-978c-4087-aabc-04a3ab8e30a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8de1613-978c-4087-aabc-04a3ab8e30a8" (UID: "a8de1613-978c-4087-aabc-04a3ab8e30a8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:00:03 crc kubenswrapper[4794]: I1215 14:00:03.065463 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwqnv\" (UniqueName: \"kubernetes.io/projected/a8de1613-978c-4087-aabc-04a3ab8e30a8-kube-api-access-jwqnv\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:03 crc kubenswrapper[4794]: I1215 14:00:03.065534 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8de1613-978c-4087-aabc-04a3ab8e30a8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:03 crc kubenswrapper[4794]: I1215 14:00:03.065548 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8de1613-978c-4087-aabc-04a3ab8e30a8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:03 crc kubenswrapper[4794]: I1215 14:00:03.643872 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" event={"ID":"a8de1613-978c-4087-aabc-04a3ab8e30a8","Type":"ContainerDied","Data":"f7c0c26826c3bbecb786fee5cb3728dc568e9397daf8c67af42483158837524e"} Dec 15 14:00:03 crc kubenswrapper[4794]: I1215 14:00:03.644433 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c0c26826c3bbecb786fee5cb3728dc568e9397daf8c67af42483158837524e" Dec 15 14:00:03 crc kubenswrapper[4794]: I1215 14:00:03.643966 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430120-8zfb6" Dec 15 14:00:24 crc kubenswrapper[4794]: I1215 14:00:24.536415 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:00:24 crc kubenswrapper[4794]: I1215 14:00:24.537143 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.274622 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-787df948cc-5dh8n"] Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.275484 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" podUID="8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" containerName="controller-manager" containerID="cri-o://9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404" gracePeriod=30 Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.774116 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.805971 4794 generic.go:334] "Generic (PLEG): container finished" podID="8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" containerID="9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404" exitCode=0 Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.806007 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" event={"ID":"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a","Type":"ContainerDied","Data":"9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404"} Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.806032 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" event={"ID":"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a","Type":"ContainerDied","Data":"254d95fcd55d18b7f685a61cbe2518ea4cbedbaa8ab7df2e7979f9752d89704b"} Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.806049 4794 scope.go:117] "RemoveContainer" containerID="9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.806134 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-787df948cc-5dh8n" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.828943 4794 scope.go:117] "RemoveContainer" containerID="9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404" Dec 15 14:00:29 crc kubenswrapper[4794]: E1215 14:00:29.829562 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404\": container with ID starting with 9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404 not found: ID does not exist" containerID="9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.829623 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404"} err="failed to get container status \"9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404\": rpc error: code = NotFound desc = could not find container \"9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404\": container with ID starting with 9f15be8e9ddf313eba6618cc7a8830c6f1d71d5ee6859442cff5b1a24eb2b404 not found: ID does not exist" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.901333 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-config\") pod \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.901408 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-client-ca\") pod \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\" (UID: 
\"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.901435 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-proxy-ca-bundles\") pod \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.901505 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flfsk\" (UniqueName: \"kubernetes.io/projected/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-kube-api-access-flfsk\") pod \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.901544 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-serving-cert\") pod \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\" (UID: \"8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a\") " Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.902137 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-client-ca" (OuterVolumeSpecName: "client-ca") pod "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" (UID: "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.902342 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" (UID: "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.902765 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-config" (OuterVolumeSpecName: "config") pod "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" (UID: "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.909309 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" (UID: "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:00:29 crc kubenswrapper[4794]: I1215 14:00:29.910143 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-kube-api-access-flfsk" (OuterVolumeSpecName: "kube-api-access-flfsk") pod "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" (UID: "8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a"). InnerVolumeSpecName "kube-api-access-flfsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.002742 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-config\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.002801 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.002819 4794 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.002843 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flfsk\" (UniqueName: \"kubernetes.io/projected/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-kube-api-access-flfsk\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.002901 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.134230 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-787df948cc-5dh8n"] Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.138465 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-787df948cc-5dh8n"] Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.749678 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" 
path="/var/lib/kubelet/pods/8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a/volumes" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.982625 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-g8696"] Dec 15 14:00:30 crc kubenswrapper[4794]: E1215 14:00:30.983120 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8de1613-978c-4087-aabc-04a3ab8e30a8" containerName="collect-profiles" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.983160 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8de1613-978c-4087-aabc-04a3ab8e30a8" containerName="collect-profiles" Dec 15 14:00:30 crc kubenswrapper[4794]: E1215 14:00:30.983195 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" containerName="controller-manager" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.983216 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" containerName="controller-manager" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.983433 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8de1613-978c-4087-aabc-04a3ab8e30a8" containerName="collect-profiles" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.983484 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8911dd1b-2ad4-42af-ba34-d0cf0b6daa6a" containerName="controller-manager" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.984320 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.987455 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.987936 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.988136 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.988722 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.988785 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.992024 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-g8696"] Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.992331 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 15 14:00:30 crc kubenswrapper[4794]: I1215 14:00:30.999349 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.119520 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-config\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " 
pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.119662 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-proxy-ca-bundles\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.119709 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33eed68a-fa36-44ba-839e-8a74a433ce2d-serving-cert\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.119732 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8p5p\" (UniqueName: \"kubernetes.io/projected/33eed68a-fa36-44ba-839e-8a74a433ce2d-kube-api-access-t8p5p\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.119775 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-client-ca\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.221123 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-proxy-ca-bundles\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.221218 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33eed68a-fa36-44ba-839e-8a74a433ce2d-serving-cert\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.221261 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8p5p\" (UniqueName: \"kubernetes.io/projected/33eed68a-fa36-44ba-839e-8a74a433ce2d-kube-api-access-t8p5p\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.221324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-client-ca\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.221408 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-config\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 
14:00:31.223241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-client-ca\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.223936 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-config\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.223937 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33eed68a-fa36-44ba-839e-8a74a433ce2d-proxy-ca-bundles\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.226758 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33eed68a-fa36-44ba-839e-8a74a433ce2d-serving-cert\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.250187 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8p5p\" (UniqueName: \"kubernetes.io/projected/33eed68a-fa36-44ba-839e-8a74a433ce2d-kube-api-access-t8p5p\") pod \"controller-manager-6b5d8ff67b-g8696\" (UID: \"33eed68a-fa36-44ba-839e-8a74a433ce2d\") " 
pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.319770 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.557478 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b5d8ff67b-g8696"] Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.582346 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xcbws"] Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.583891 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.599136 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xcbws"] Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.726690 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-registry-tls\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk77b\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-kube-api-access-rk77b\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727230 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727252 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-registry-certificates\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727305 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727329 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-trusted-ca\") pod \"image-registry-66df7c8f76-xcbws\" (UID: 
\"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.727350 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-bound-sa-token\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.747614 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.818141 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" event={"ID":"33eed68a-fa36-44ba-839e-8a74a433ce2d","Type":"ContainerStarted","Data":"2e1297ca9f155b4170923ece2fff1d8f1d1bb3de9f3709bb16febf8a99c05bb1"} Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.819431 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.819543 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" event={"ID":"33eed68a-fa36-44ba-839e-8a74a433ce2d","Type":"ContainerStarted","Data":"f44accaaa1c40541071573679fbce29f44c83c05c6c8f02b2b158b53604104a4"} Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.820191 4794 patch_prober.go:28] interesting 
pod/controller-manager-6b5d8ff67b-g8696 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.820227 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" podUID="33eed68a-fa36-44ba-839e-8a74a433ce2d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-trusted-ca\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829455 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-bound-sa-token\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829526 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-registry-tls\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829562 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rk77b\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-kube-api-access-rk77b\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829624 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829661 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-registry-certificates\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.829725 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.830641 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.830912 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-trusted-ca\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.831139 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-registry-certificates\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.835040 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-registry-tls\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.836933 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" podStartSLOduration=2.8369185740000002 podStartE2EDuration="2.836918574s" podCreationTimestamp="2025-12-15 14:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:00:31.835631706 +0000 UTC m=+393.687654154" watchObservedRunningTime="2025-12-15 14:00:31.836918574 +0000 UTC m=+393.688941012" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.837572 4794 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.850152 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-bound-sa-token\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.852419 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk77b\" (UniqueName: \"kubernetes.io/projected/a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05-kube-api-access-rk77b\") pod \"image-registry-66df7c8f76-xcbws\" (UID: \"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05\") " pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:31 crc kubenswrapper[4794]: I1215 14:00:31.898629 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:32 crc kubenswrapper[4794]: I1215 14:00:32.311637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xcbws"] Dec 15 14:00:32 crc kubenswrapper[4794]: W1215 14:00:32.317160 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0a5abc3_f02f_4eca_a382_ae6e2cb2ef05.slice/crio-ebae8e4a7b01d41d6e4330ecff117d1922e787a13ab2ec1f3974aa92d376bcb5 WatchSource:0}: Error finding container ebae8e4a7b01d41d6e4330ecff117d1922e787a13ab2ec1f3974aa92d376bcb5: Status 404 returned error can't find the container with id ebae8e4a7b01d41d6e4330ecff117d1922e787a13ab2ec1f3974aa92d376bcb5 Dec 15 14:00:32 crc kubenswrapper[4794]: I1215 14:00:32.824850 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" event={"ID":"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05","Type":"ContainerStarted","Data":"e79595984030d1ead46943ec144c59acff1e7c73a517c63626b3d5a352a37f48"} Dec 15 14:00:32 crc kubenswrapper[4794]: I1215 14:00:32.825387 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" event={"ID":"a0a5abc3-f02f-4eca-a382-ae6e2cb2ef05","Type":"ContainerStarted","Data":"ebae8e4a7b01d41d6e4330ecff117d1922e787a13ab2ec1f3974aa92d376bcb5"} Dec 15 14:00:32 crc kubenswrapper[4794]: I1215 14:00:32.831609 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b5d8ff67b-g8696" Dec 15 14:00:32 crc kubenswrapper[4794]: I1215 14:00:32.867497 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" podStartSLOduration=1.867472669 podStartE2EDuration="1.867472669s" podCreationTimestamp="2025-12-15 14:00:31 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:00:32.848335886 +0000 UTC m=+394.700358394" watchObservedRunningTime="2025-12-15 14:00:32.867472669 +0000 UTC m=+394.719495137" Dec 15 14:00:33 crc kubenswrapper[4794]: I1215 14:00:33.831961 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.538312 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hj6qt"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.539337 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hj6qt" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="registry-server" containerID="cri-o://4e563c0ec966978662106c7f60108c8d6cdba60cd75fdd5011a1c12c12288c79" gracePeriod=30 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.548020 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lkl6"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.548219 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2lkl6" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="registry-server" containerID="cri-o://9b0713c45dbf8089905f1a8d77cb4732e668e2a79d474b12c71e1be4c5e3225c" gracePeriod=30 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.570867 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvkw5"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.571064 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" 
containerName="marketplace-operator" containerID="cri-o://dc7b550f9c643de1cebbec0fa9998711c97b850da26f83ea037382808c6d9c1b" gracePeriod=30 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.586696 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvh"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.587052 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5sjvh" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="registry-server" containerID="cri-o://f620805ee7c2db4ed0acd4ca5a446043321fa203315478f9b1f4ba13fa838bbc" gracePeriod=30 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.599929 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j9rlt"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.601188 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.603782 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snr54"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.604065 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-snr54" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="registry-server" containerID="cri-o://19b0c6491fb79b59559fe8e3a21705f9ba3355134fbb1e6469ffc24ad2e4d1a6" gracePeriod=30 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.611411 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j9rlt"] Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.712542 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bs9\" (UniqueName: 
\"kubernetes.io/projected/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-kube-api-access-48bs9\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.712648 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.712692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.814773 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.815260 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bs9\" (UniqueName: \"kubernetes.io/projected/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-kube-api-access-48bs9\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.815307 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.816157 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.822816 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.833169 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bs9\" (UniqueName: \"kubernetes.io/projected/fb02c0bd-f248-4bad-b91c-ed3581cda0bb-kube-api-access-48bs9\") pod \"marketplace-operator-79b997595-j9rlt\" (UID: \"fb02c0bd-f248-4bad-b91c-ed3581cda0bb\") " pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.933741 4794 generic.go:334] "Generic (PLEG): container finished" podID="9f11449c-f57e-46af-bd65-ed450d85ba60" 
containerID="f620805ee7c2db4ed0acd4ca5a446043321fa203315478f9b1f4ba13fa838bbc" exitCode=0 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.933794 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvh" event={"ID":"9f11449c-f57e-46af-bd65-ed450d85ba60","Type":"ContainerDied","Data":"f620805ee7c2db4ed0acd4ca5a446043321fa203315478f9b1f4ba13fa838bbc"} Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.937510 4794 generic.go:334] "Generic (PLEG): container finished" podID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerID="9b0713c45dbf8089905f1a8d77cb4732e668e2a79d474b12c71e1be4c5e3225c" exitCode=0 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.937573 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lkl6" event={"ID":"4e69f94d-9c18-4b19-8290-5e2d86ab4bae","Type":"ContainerDied","Data":"9b0713c45dbf8089905f1a8d77cb4732e668e2a79d474b12c71e1be4c5e3225c"} Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.940749 4794 generic.go:334] "Generic (PLEG): container finished" podID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerID="4e563c0ec966978662106c7f60108c8d6cdba60cd75fdd5011a1c12c12288c79" exitCode=0 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.940799 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj6qt" event={"ID":"e64ae040-f93d-4dcc-8dce-812b127b0630","Type":"ContainerDied","Data":"4e563c0ec966978662106c7f60108c8d6cdba60cd75fdd5011a1c12c12288c79"} Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.942522 4794 generic.go:334] "Generic (PLEG): container finished" podID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" containerID="dc7b550f9c643de1cebbec0fa9998711c97b850da26f83ea037382808c6d9c1b" exitCode=0 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.942619 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" event={"ID":"d502ead2-f4b7-4a34-bd18-6fc872cb30c2","Type":"ContainerDied","Data":"dc7b550f9c643de1cebbec0fa9998711c97b850da26f83ea037382808c6d9c1b"} Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.946234 4794 generic.go:334] "Generic (PLEG): container finished" podID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerID="19b0c6491fb79b59559fe8e3a21705f9ba3355134fbb1e6469ffc24ad2e4d1a6" exitCode=0 Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.946255 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snr54" event={"ID":"e7c17057-4b6f-40ed-8901-b8e6249317ed","Type":"ContainerDied","Data":"19b0c6491fb79b59559fe8e3a21705f9ba3355134fbb1e6469ffc24ad2e4d1a6"} Dec 15 14:00:48 crc kubenswrapper[4794]: I1215 14:00:48.952366 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.116311 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.222759 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-catalog-content\") pod \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.222873 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-utilities\") pod \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.222930 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8qm2\" (UniqueName: \"kubernetes.io/projected/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-kube-api-access-f8qm2\") pod \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\" (UID: \"4e69f94d-9c18-4b19-8290-5e2d86ab4bae\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.223653 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-utilities" (OuterVolumeSpecName: "utilities") pod "4e69f94d-9c18-4b19-8290-5e2d86ab4bae" (UID: "4e69f94d-9c18-4b19-8290-5e2d86ab4bae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.233677 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-kube-api-access-f8qm2" (OuterVolumeSpecName: "kube-api-access-f8qm2") pod "4e69f94d-9c18-4b19-8290-5e2d86ab4bae" (UID: "4e69f94d-9c18-4b19-8290-5e2d86ab4bae"). InnerVolumeSpecName "kube-api-access-f8qm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.245523 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.256603 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.287955 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.304405 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.308328 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e69f94d-9c18-4b19-8290-5e2d86ab4bae" (UID: "4e69f94d-9c18-4b19-8290-5e2d86ab4bae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.313626 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9"] Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.313897 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" podUID="761723b9-3e5f-449d-ba36-4a41c9664ab1" containerName="route-controller-manager" containerID="cri-o://08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3" gracePeriod=30 Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.323926 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-catalog-content\") pod \"9f11449c-f57e-46af-bd65-ed450d85ba60\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.323990 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-utilities\") pod \"e64ae040-f93d-4dcc-8dce-812b127b0630\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324011 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vm82\" (UniqueName: \"kubernetes.io/projected/9f11449c-f57e-46af-bd65-ed450d85ba60-kube-api-access-2vm82\") pod \"9f11449c-f57e-46af-bd65-ed450d85ba60\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324043 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnlzz\" (UniqueName: 
\"kubernetes.io/projected/e64ae040-f93d-4dcc-8dce-812b127b0630-kube-api-access-pnlzz\") pod \"e64ae040-f93d-4dcc-8dce-812b127b0630\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324083 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-utilities\") pod \"9f11449c-f57e-46af-bd65-ed450d85ba60\" (UID: \"9f11449c-f57e-46af-bd65-ed450d85ba60\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324137 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-catalog-content\") pod \"e64ae040-f93d-4dcc-8dce-812b127b0630\" (UID: \"e64ae040-f93d-4dcc-8dce-812b127b0630\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324322 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324335 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8qm2\" (UniqueName: \"kubernetes.io/projected/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-kube-api-access-f8qm2\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.324345 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e69f94d-9c18-4b19-8290-5e2d86ab4bae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.326376 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-utilities" (OuterVolumeSpecName: "utilities") pod "9f11449c-f57e-46af-bd65-ed450d85ba60" (UID: 
"9f11449c-f57e-46af-bd65-ed450d85ba60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.327039 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-utilities" (OuterVolumeSpecName: "utilities") pod "e64ae040-f93d-4dcc-8dce-812b127b0630" (UID: "e64ae040-f93d-4dcc-8dce-812b127b0630"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.328340 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64ae040-f93d-4dcc-8dce-812b127b0630-kube-api-access-pnlzz" (OuterVolumeSpecName: "kube-api-access-pnlzz") pod "e64ae040-f93d-4dcc-8dce-812b127b0630" (UID: "e64ae040-f93d-4dcc-8dce-812b127b0630"). InnerVolumeSpecName "kube-api-access-pnlzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.335874 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f11449c-f57e-46af-bd65-ed450d85ba60-kube-api-access-2vm82" (OuterVolumeSpecName: "kube-api-access-2vm82") pod "9f11449c-f57e-46af-bd65-ed450d85ba60" (UID: "9f11449c-f57e-46af-bd65-ed450d85ba60"). InnerVolumeSpecName "kube-api-access-2vm82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.371986 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f11449c-f57e-46af-bd65-ed450d85ba60" (UID: "9f11449c-f57e-46af-bd65-ed450d85ba60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.394812 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e64ae040-f93d-4dcc-8dce-812b127b0630" (UID: "e64ae040-f93d-4dcc-8dce-812b127b0630"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425284 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstwn\" (UniqueName: \"kubernetes.io/projected/e7c17057-4b6f-40ed-8901-b8e6249317ed-kube-api-access-sstwn\") pod \"e7c17057-4b6f-40ed-8901-b8e6249317ed\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425361 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-operator-metrics\") pod \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425387 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-catalog-content\") pod \"e7c17057-4b6f-40ed-8901-b8e6249317ed\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425422 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-trusted-ca\") pod \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " Dec 15 14:00:49 crc 
kubenswrapper[4794]: I1215 14:00:49.425443 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp9hv\" (UniqueName: \"kubernetes.io/projected/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-kube-api-access-vp9hv\") pod \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\" (UID: \"d502ead2-f4b7-4a34-bd18-6fc872cb30c2\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-utilities\") pod \"e7c17057-4b6f-40ed-8901-b8e6249317ed\" (UID: \"e7c17057-4b6f-40ed-8901-b8e6249317ed\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425944 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425957 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425966 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f11449c-f57e-46af-bd65-ed450d85ba60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.425977 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e64ae040-f93d-4dcc-8dce-812b127b0630-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.426008 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vm82\" (UniqueName: \"kubernetes.io/projected/9f11449c-f57e-46af-bd65-ed450d85ba60-kube-api-access-2vm82\") on 
node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.426017 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnlzz\" (UniqueName: \"kubernetes.io/projected/e64ae040-f93d-4dcc-8dce-812b127b0630-kube-api-access-pnlzz\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.427006 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-utilities" (OuterVolumeSpecName: "utilities") pod "e7c17057-4b6f-40ed-8901-b8e6249317ed" (UID: "e7c17057-4b6f-40ed-8901-b8e6249317ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.428485 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d502ead2-f4b7-4a34-bd18-6fc872cb30c2" (UID: "d502ead2-f4b7-4a34-bd18-6fc872cb30c2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.429574 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c17057-4b6f-40ed-8901-b8e6249317ed-kube-api-access-sstwn" (OuterVolumeSpecName: "kube-api-access-sstwn") pod "e7c17057-4b6f-40ed-8901-b8e6249317ed" (UID: "e7c17057-4b6f-40ed-8901-b8e6249317ed"). InnerVolumeSpecName "kube-api-access-sstwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.431772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d502ead2-f4b7-4a34-bd18-6fc872cb30c2" (UID: "d502ead2-f4b7-4a34-bd18-6fc872cb30c2"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.433831 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-kube-api-access-vp9hv" (OuterVolumeSpecName: "kube-api-access-vp9hv") pod "d502ead2-f4b7-4a34-bd18-6fc872cb30c2" (UID: "d502ead2-f4b7-4a34-bd18-6fc872cb30c2"). InnerVolumeSpecName "kube-api-access-vp9hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.505341 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j9rlt"] Dec 15 14:00:49 crc kubenswrapper[4794]: W1215 14:00:49.509163 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb02c0bd_f248_4bad_b91c_ed3581cda0bb.slice/crio-77c99e5668e796f4d43d0431e1ce7cda1b27bbaa2e3f415d465c0f209a8eabee WatchSource:0}: Error finding container 77c99e5668e796f4d43d0431e1ce7cda1b27bbaa2e3f415d465c0f209a8eabee: Status 404 returned error can't find the container with id 77c99e5668e796f4d43d0431e1ce7cda1b27bbaa2e3f415d465c0f209a8eabee Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.528088 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc 
kubenswrapper[4794]: I1215 14:00:49.528128 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstwn\" (UniqueName: \"kubernetes.io/projected/e7c17057-4b6f-40ed-8901-b8e6249317ed-kube-api-access-sstwn\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.528145 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.528157 4794 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.528169 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp9hv\" (UniqueName: \"kubernetes.io/projected/d502ead2-f4b7-4a34-bd18-6fc872cb30c2-kube-api-access-vp9hv\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.561066 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7c17057-4b6f-40ed-8901-b8e6249317ed" (UID: "e7c17057-4b6f-40ed-8901-b8e6249317ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.629979 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c17057-4b6f-40ed-8901-b8e6249317ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.766511 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.832748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/761723b9-3e5f-449d-ba36-4a41c9664ab1-serving-cert\") pod \"761723b9-3e5f-449d-ba36-4a41c9664ab1\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.832800 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-client-ca\") pod \"761723b9-3e5f-449d-ba36-4a41c9664ab1\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.832830 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd9vl\" (UniqueName: \"kubernetes.io/projected/761723b9-3e5f-449d-ba36-4a41c9664ab1-kube-api-access-rd9vl\") pod \"761723b9-3e5f-449d-ba36-4a41c9664ab1\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.832869 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-config\") pod \"761723b9-3e5f-449d-ba36-4a41c9664ab1\" (UID: \"761723b9-3e5f-449d-ba36-4a41c9664ab1\") " Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.833590 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-client-ca" (OuterVolumeSpecName: "client-ca") pod "761723b9-3e5f-449d-ba36-4a41c9664ab1" (UID: "761723b9-3e5f-449d-ba36-4a41c9664ab1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.833763 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-config" (OuterVolumeSpecName: "config") pod "761723b9-3e5f-449d-ba36-4a41c9664ab1" (UID: "761723b9-3e5f-449d-ba36-4a41c9664ab1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.840789 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761723b9-3e5f-449d-ba36-4a41c9664ab1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "761723b9-3e5f-449d-ba36-4a41c9664ab1" (UID: "761723b9-3e5f-449d-ba36-4a41c9664ab1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.840800 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761723b9-3e5f-449d-ba36-4a41c9664ab1-kube-api-access-rd9vl" (OuterVolumeSpecName: "kube-api-access-rd9vl") pod "761723b9-3e5f-449d-ba36-4a41c9664ab1" (UID: "761723b9-3e5f-449d-ba36-4a41c9664ab1"). InnerVolumeSpecName "kube-api-access-rd9vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.935745 4794 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/761723b9-3e5f-449d-ba36-4a41c9664ab1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.935854 4794 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.935896 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd9vl\" (UniqueName: \"kubernetes.io/projected/761723b9-3e5f-449d-ba36-4a41c9664ab1-kube-api-access-rd9vl\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.935940 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/761723b9-3e5f-449d-ba36-4a41c9664ab1-config\") on node \"crc\" DevicePath \"\"" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.953894 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lkl6" event={"ID":"4e69f94d-9c18-4b19-8290-5e2d86ab4bae","Type":"ContainerDied","Data":"ed136637b27af5e46288169708c7a63ac0b9b7e0286b2935d897c6b7710a44e4"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.953915 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lkl6" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.953965 4794 scope.go:117] "RemoveContainer" containerID="9b0713c45dbf8089905f1a8d77cb4732e668e2a79d474b12c71e1be4c5e3225c" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.955907 4794 generic.go:334] "Generic (PLEG): container finished" podID="761723b9-3e5f-449d-ba36-4a41c9664ab1" containerID="08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3" exitCode=0 Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.955962 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" event={"ID":"761723b9-3e5f-449d-ba36-4a41c9664ab1","Type":"ContainerDied","Data":"08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.955986 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" event={"ID":"761723b9-3e5f-449d-ba36-4a41c9664ab1","Type":"ContainerDied","Data":"b1bd23e4b3d0a5e462ec5bb25112331fe8fa2664ae77f196627b731848953056"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.956048 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.959259 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" event={"ID":"d502ead2-f4b7-4a34-bd18-6fc872cb30c2","Type":"ContainerDied","Data":"830a09b3f081b076260ed9828989055141e1dd3df85775b2250e975c65400e24"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.959361 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tvkw5" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.963748 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj6qt" event={"ID":"e64ae040-f93d-4dcc-8dce-812b127b0630","Type":"ContainerDied","Data":"40a7c8cab81b0c4b28211598fb878413ae6945369cb5c1948cdb74c75dbba010"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.963891 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj6qt" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.969218 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" event={"ID":"fb02c0bd-f248-4bad-b91c-ed3581cda0bb","Type":"ContainerStarted","Data":"fbe23cb13e0eee7d6bf8c66c18db6a38310b83b574d2394fa940bd1bc3024caa"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.969258 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" event={"ID":"fb02c0bd-f248-4bad-b91c-ed3581cda0bb","Type":"ContainerStarted","Data":"77c99e5668e796f4d43d0431e1ce7cda1b27bbaa2e3f415d465c0f209a8eabee"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.969456 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.972697 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.974125 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvh" event={"ID":"9f11449c-f57e-46af-bd65-ed450d85ba60","Type":"ContainerDied","Data":"b0cf56f05e21975c80242bda5f3cceb9ebb8824380ec4f907918cb2fa0f1a899"} 
Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.974221 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvh" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.977617 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snr54" event={"ID":"e7c17057-4b6f-40ed-8901-b8e6249317ed","Type":"ContainerDied","Data":"c08f915e25b892acb13d839c5f74b8d45ad8aedb32b3d505fdc75a268519f881"} Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.977670 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snr54" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.980745 4794 scope.go:117] "RemoveContainer" containerID="1ad01068926b4c29d1a4606e101bc7d8406779049135df3d9ca855e4bbea7021" Dec 15 14:00:49 crc kubenswrapper[4794]: I1215 14:00:49.990392 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j9rlt" podStartSLOduration=1.990374331 podStartE2EDuration="1.990374331s" podCreationTimestamp="2025-12-15 14:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:00:49.98488909 +0000 UTC m=+411.836911538" watchObservedRunningTime="2025-12-15 14:00:49.990374331 +0000 UTC m=+411.842396789" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.021005 4794 scope.go:117] "RemoveContainer" containerID="accfb95f38d2325c5fa01cd8726b2319341acbf5b134df565c0c0a63dd050a6a" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.024877 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.031637 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7f9d8fdb97-v6xg9"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.036431 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lkl6"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.047625 4794 scope.go:117] "RemoveContainer" containerID="08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.061682 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2lkl6"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.072197 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hj6qt"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.080382 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hj6qt"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.083003 4794 scope.go:117] "RemoveContainer" containerID="08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.084332 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3\": container with ID starting with 08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3 not found: ID does not exist" containerID="08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.084390 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3"} err="failed to get container status \"08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3\": rpc error: code = NotFound desc = could not find container 
\"08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3\": container with ID starting with 08f456dd228461495068771e23c7ec56ec339af3829c1764ab6b5d31b55715f3 not found: ID does not exist" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.084417 4794 scope.go:117] "RemoveContainer" containerID="dc7b550f9c643de1cebbec0fa9998711c97b850da26f83ea037382808c6d9c1b" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.087013 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvh"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.090204 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvh"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.095659 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvkw5"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.097595 4794 scope.go:117] "RemoveContainer" containerID="4e563c0ec966978662106c7f60108c8d6cdba60cd75fdd5011a1c12c12288c79" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.099903 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tvkw5"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.104120 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snr54"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.109266 4794 scope.go:117] "RemoveContainer" containerID="b78270170dc1985d94193370da58c8a972dbbfc5ae665327f421e1943e069ec3" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.111193 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snr54"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.123980 4794 scope.go:117] "RemoveContainer" containerID="c9229114feafcb8f1cb1f2bdc548a53a40bdc3afb091d50ffc8cfe9c3392f9be" Dec 15 14:00:50 crc 
kubenswrapper[4794]: I1215 14:00:50.138243 4794 scope.go:117] "RemoveContainer" containerID="f620805ee7c2db4ed0acd4ca5a446043321fa203315478f9b1f4ba13fa838bbc" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.151678 4794 scope.go:117] "RemoveContainer" containerID="43c3e82725bb5aa95a7d79834af725b011c0a4c42441c2230b230702b8168559" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.165363 4794 scope.go:117] "RemoveContainer" containerID="cfa5656dd220f2e750b1fb30e56de044fd8835c0dc1a3a471eeda191dc3d2cc6" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.176472 4794 scope.go:117] "RemoveContainer" containerID="19b0c6491fb79b59559fe8e3a21705f9ba3355134fbb1e6469ffc24ad2e4d1a6" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.189731 4794 scope.go:117] "RemoveContainer" containerID="58fa4436659564f4c2ab19d4f860a1d79a0ba96dd469987464eb0f9655292cb1" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.201557 4794 scope.go:117] "RemoveContainer" containerID="0c07d7edcd6b9a234a6a380d5f4d430346b8630f367c5f94e3cee5bc6ca41536" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.568943 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cqqqj"] Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569312 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569345 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569367 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569385 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" 
containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569417 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569433 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569457 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569474 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569498 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569515 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569536 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569552 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569576 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569654 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" 
containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569680 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569696 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569715 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569762 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="extract-utilities" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569784 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569801 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569827 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761723b9-3e5f-449d-ba36-4a41c9664ab1" containerName="route-controller-manager" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569844 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="761723b9-3e5f-449d-ba36-4a41c9664ab1" containerName="route-controller-manager" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569871 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569886 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569907 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569924 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="extract-content" Dec 15 14:00:50 crc kubenswrapper[4794]: E1215 14:00:50.569953 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" containerName="marketplace-operator" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.569969 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" containerName="marketplace-operator" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.570182 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="761723b9-3e5f-449d-ba36-4a41c9664ab1" containerName="route-controller-manager" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.570209 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.570239 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.570261 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" containerName="marketplace-operator" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.570283 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.570307 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" containerName="registry-server" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.571884 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.576760 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.582509 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqqqj"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.648333 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007bbddd-2d30-424a-b855-a317f4b14c3d-utilities\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.648558 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007bbddd-2d30-424a-b855-a317f4b14c3d-catalog-content\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.648673 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5q26\" (UniqueName: \"kubernetes.io/projected/007bbddd-2d30-424a-b855-a317f4b14c3d-kube-api-access-b5q26\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 
14:00:50.744408 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e69f94d-9c18-4b19-8290-5e2d86ab4bae" path="/var/lib/kubelet/pods/4e69f94d-9c18-4b19-8290-5e2d86ab4bae/volumes" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.745433 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761723b9-3e5f-449d-ba36-4a41c9664ab1" path="/var/lib/kubelet/pods/761723b9-3e5f-449d-ba36-4a41c9664ab1/volumes" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.746059 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f11449c-f57e-46af-bd65-ed450d85ba60" path="/var/lib/kubelet/pods/9f11449c-f57e-46af-bd65-ed450d85ba60/volumes" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.747344 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d502ead2-f4b7-4a34-bd18-6fc872cb30c2" path="/var/lib/kubelet/pods/d502ead2-f4b7-4a34-bd18-6fc872cb30c2/volumes" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.747915 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64ae040-f93d-4dcc-8dce-812b127b0630" path="/var/lib/kubelet/pods/e64ae040-f93d-4dcc-8dce-812b127b0630/volumes" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.748624 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c17057-4b6f-40ed-8901-b8e6249317ed" path="/var/lib/kubelet/pods/e7c17057-4b6f-40ed-8901-b8e6249317ed/volumes" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.749612 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007bbddd-2d30-424a-b855-a317f4b14c3d-utilities\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.749665 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007bbddd-2d30-424a-b855-a317f4b14c3d-catalog-content\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.749693 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5q26\" (UniqueName: \"kubernetes.io/projected/007bbddd-2d30-424a-b855-a317f4b14c3d-kube-api-access-b5q26\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.749961 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007bbddd-2d30-424a-b855-a317f4b14c3d-utilities\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.750083 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007bbddd-2d30-424a-b855-a317f4b14c3d-catalog-content\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.769123 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5q26\" (UniqueName: \"kubernetes.io/projected/007bbddd-2d30-424a-b855-a317f4b14c3d-kube-api-access-b5q26\") pod \"certified-operators-cqqqj\" (UID: \"007bbddd-2d30-424a-b855-a317f4b14c3d\") " pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.902135 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.988059 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf"] Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.989044 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.994142 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.994382 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.994436 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.994528 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.995303 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 15 14:00:50 crc kubenswrapper[4794]: I1215 14:00:50.996132 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.021906 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf"] Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.054353 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2598cdcd-f266-4ff0-9a07-42d951ce2af2-client-ca\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.054461 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2598cdcd-f266-4ff0-9a07-42d951ce2af2-serving-cert\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.054497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84skn\" (UniqueName: \"kubernetes.io/projected/2598cdcd-f266-4ff0-9a07-42d951ce2af2-kube-api-access-84skn\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.054530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2598cdcd-f266-4ff0-9a07-42d951ce2af2-config\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.151039 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q8dqr"] Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.152612 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.154355 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.155633 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2598cdcd-f266-4ff0-9a07-42d951ce2af2-client-ca\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.155718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2598cdcd-f266-4ff0-9a07-42d951ce2af2-serving-cert\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.155745 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84skn\" (UniqueName: \"kubernetes.io/projected/2598cdcd-f266-4ff0-9a07-42d951ce2af2-kube-api-access-84skn\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.155785 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2598cdcd-f266-4ff0-9a07-42d951ce2af2-config\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " 
pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.158723 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2598cdcd-f266-4ff0-9a07-42d951ce2af2-config\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.160639 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2598cdcd-f266-4ff0-9a07-42d951ce2af2-client-ca\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.162828 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8dqr"] Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.164391 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2598cdcd-f266-4ff0-9a07-42d951ce2af2-serving-cert\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.181551 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84skn\" (UniqueName: \"kubernetes.io/projected/2598cdcd-f266-4ff0-9a07-42d951ce2af2-kube-api-access-84skn\") pod \"route-controller-manager-58c5895464-rb9tf\" (UID: \"2598cdcd-f266-4ff0-9a07-42d951ce2af2\") " pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc 
kubenswrapper[4794]: I1215 14:00:51.257308 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-utilities\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.257811 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-catalog-content\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.257962 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcwn\" (UniqueName: \"kubernetes.io/projected/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-kube-api-access-6fcwn\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.306983 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqqqj"] Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.311812 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:51 crc kubenswrapper[4794]: W1215 14:00:51.312321 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod007bbddd_2d30_424a_b855_a317f4b14c3d.slice/crio-3e684ae08b7dd7600db7a713a4ef939fd5844a454193b9b42b4f22ca16774830 WatchSource:0}: Error finding container 3e684ae08b7dd7600db7a713a4ef939fd5844a454193b9b42b4f22ca16774830: Status 404 returned error can't find the container with id 3e684ae08b7dd7600db7a713a4ef939fd5844a454193b9b42b4f22ca16774830 Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.359941 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcwn\" (UniqueName: \"kubernetes.io/projected/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-kube-api-access-6fcwn\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.360026 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-utilities\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.360090 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-catalog-content\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.360907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-utilities\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.360946 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-catalog-content\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.387206 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcwn\" (UniqueName: \"kubernetes.io/projected/5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf-kube-api-access-6fcwn\") pod \"redhat-marketplace-q8dqr\" (UID: \"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf\") " pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:51 crc kubenswrapper[4794]: I1215 14:00:51.491479 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:51.704042 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf"] Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:51.904207 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xcbws" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:51.988956 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q5mq8"] Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.029076 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" event={"ID":"2598cdcd-f266-4ff0-9a07-42d951ce2af2","Type":"ContainerStarted","Data":"7f40c7d169a0c1933b16b5dce17a2b2d6d42c24b8bc747884fc06df9de0e0f67"} Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.029108 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" event={"ID":"2598cdcd-f266-4ff0-9a07-42d951ce2af2","Type":"ContainerStarted","Data":"a2457d7a286317c556e498de75066925e90e6acacae64b6602f559ec04d6ee3e"} Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.030061 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.031798 4794 generic.go:334] "Generic (PLEG): container finished" podID="007bbddd-2d30-424a-b855-a317f4b14c3d" containerID="ad3144ae8ffa3c097c822dbd772dbaa49eaf6914a71343c3bd2e1e92951d1dc3" exitCode=0 Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.032477 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-cqqqj" event={"ID":"007bbddd-2d30-424a-b855-a317f4b14c3d","Type":"ContainerDied","Data":"ad3144ae8ffa3c097c822dbd772dbaa49eaf6914a71343c3bd2e1e92951d1dc3"} Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.032492 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqqqj" event={"ID":"007bbddd-2d30-424a-b855-a317f4b14c3d","Type":"ContainerStarted","Data":"3e684ae08b7dd7600db7a713a4ef939fd5844a454193b9b42b4f22ca16774830"} Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.047918 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" podStartSLOduration=3.047898996 podStartE2EDuration="3.047898996s" podCreationTimestamp="2025-12-15 14:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:00:52.047100972 +0000 UTC m=+413.899123410" watchObservedRunningTime="2025-12-15 14:00:52.047898996 +0000 UTC m=+413.899921434" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.356466 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58c5895464-rb9tf" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.576786 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8dqr"] Dec 15 14:00:52 crc kubenswrapper[4794]: W1215 14:00:52.585712 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ccf70d9_cc87_4a5f_acd1_0f07a6054aaf.slice/crio-80d0097155e1013184a3fb13e1611a74e68df7f218edc25e77337895f41d49b4 WatchSource:0}: Error finding container 80d0097155e1013184a3fb13e1611a74e68df7f218edc25e77337895f41d49b4: Status 404 returned error can't find the container with id 
80d0097155e1013184a3fb13e1611a74e68df7f218edc25e77337895f41d49b4 Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.957166 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hl98z"] Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.958827 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.961248 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 15 14:00:52 crc kubenswrapper[4794]: I1215 14:00:52.975359 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hl98z"] Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.037369 4794 generic.go:334] "Generic (PLEG): container finished" podID="5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf" containerID="e9de9bccdc57ec3c62697d60e940928c90f24fdad3b356ec59577eeb4ca32523" exitCode=0 Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.037423 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8dqr" event={"ID":"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf","Type":"ContainerDied","Data":"e9de9bccdc57ec3c62697d60e940928c90f24fdad3b356ec59577eeb4ca32523"} Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.037476 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8dqr" event={"ID":"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf","Type":"ContainerStarted","Data":"80d0097155e1013184a3fb13e1611a74e68df7f218edc25e77337895f41d49b4"} Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.093225 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8mk\" (UniqueName: \"kubernetes.io/projected/1058fe26-4e3d-428c-9fcc-079b0efc1a33-kube-api-access-4b8mk\") pod 
\"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.093378 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1058fe26-4e3d-428c-9fcc-079b0efc1a33-catalog-content\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.093437 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1058fe26-4e3d-428c-9fcc-079b0efc1a33-utilities\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.194740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1058fe26-4e3d-428c-9fcc-079b0efc1a33-utilities\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.194796 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8mk\" (UniqueName: \"kubernetes.io/projected/1058fe26-4e3d-428c-9fcc-079b0efc1a33-kube-api-access-4b8mk\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.194845 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1058fe26-4e3d-428c-9fcc-079b0efc1a33-catalog-content\") pod 
\"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.195264 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1058fe26-4e3d-428c-9fcc-079b0efc1a33-catalog-content\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.195278 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1058fe26-4e3d-428c-9fcc-079b0efc1a33-utilities\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.212700 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8mk\" (UniqueName: \"kubernetes.io/projected/1058fe26-4e3d-428c-9fcc-079b0efc1a33-kube-api-access-4b8mk\") pod \"redhat-operators-hl98z\" (UID: \"1058fe26-4e3d-428c-9fcc-079b0efc1a33\") " pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.280463 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.560164 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6qjdr"] Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.565379 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.568188 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.570229 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qjdr"] Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.700527 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2vf\" (UniqueName: \"kubernetes.io/projected/69c600b2-aa35-4a90-b45c-d15d6b2650d3-kube-api-access-kq2vf\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.700624 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c600b2-aa35-4a90-b45c-d15d6b2650d3-catalog-content\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.700669 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c600b2-aa35-4a90-b45c-d15d6b2650d3-utilities\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.723536 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hl98z"] Dec 15 14:00:53 crc kubenswrapper[4794]: W1215 14:00:53.726808 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1058fe26_4e3d_428c_9fcc_079b0efc1a33.slice/crio-004a7b16c2e13f85c019f82fb124a1f3b39b924d1642017bc8bb2cf78d06173c WatchSource:0}: Error finding container 004a7b16c2e13f85c019f82fb124a1f3b39b924d1642017bc8bb2cf78d06173c: Status 404 returned error can't find the container with id 004a7b16c2e13f85c019f82fb124a1f3b39b924d1642017bc8bb2cf78d06173c Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.802440 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c600b2-aa35-4a90-b45c-d15d6b2650d3-utilities\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.802527 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2vf\" (UniqueName: \"kubernetes.io/projected/69c600b2-aa35-4a90-b45c-d15d6b2650d3-kube-api-access-kq2vf\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.802568 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c600b2-aa35-4a90-b45c-d15d6b2650d3-catalog-content\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.803425 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c600b2-aa35-4a90-b45c-d15d6b2650d3-utilities\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc 
kubenswrapper[4794]: I1215 14:00:53.804075 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c600b2-aa35-4a90-b45c-d15d6b2650d3-catalog-content\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.823303 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2vf\" (UniqueName: \"kubernetes.io/projected/69c600b2-aa35-4a90-b45c-d15d6b2650d3-kube-api-access-kq2vf\") pod \"community-operators-6qjdr\" (UID: \"69c600b2-aa35-4a90-b45c-d15d6b2650d3\") " pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:53 crc kubenswrapper[4794]: I1215 14:00:53.885928 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.044280 4794 generic.go:334] "Generic (PLEG): container finished" podID="1058fe26-4e3d-428c-9fcc-079b0efc1a33" containerID="9e76c5418c7d47e5c4ae39b7f1d4395e7e203964607c72a11f114cf8a1231143" exitCode=0 Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.044463 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl98z" event={"ID":"1058fe26-4e3d-428c-9fcc-079b0efc1a33","Type":"ContainerDied","Data":"9e76c5418c7d47e5c4ae39b7f1d4395e7e203964607c72a11f114cf8a1231143"} Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.044576 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl98z" event={"ID":"1058fe26-4e3d-428c-9fcc-079b0efc1a33","Type":"ContainerStarted","Data":"004a7b16c2e13f85c019f82fb124a1f3b39b924d1642017bc8bb2cf78d06173c"} Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.053165 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf" containerID="338946cc8b4504c1d6b0950ced4b17d44c283bd8c2f9ef75e2645ba3f8420d5f" exitCode=0 Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.053280 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8dqr" event={"ID":"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf","Type":"ContainerDied","Data":"338946cc8b4504c1d6b0950ced4b17d44c283bd8c2f9ef75e2645ba3f8420d5f"} Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.056866 4794 generic.go:334] "Generic (PLEG): container finished" podID="007bbddd-2d30-424a-b855-a317f4b14c3d" containerID="b405be9157c363f0b8311d87aea0ba03b5e6731ab60a144b4a607d0d7661c92a" exitCode=0 Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.057027 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqqqj" event={"ID":"007bbddd-2d30-424a-b855-a317f4b14c3d","Type":"ContainerDied","Data":"b405be9157c363f0b8311d87aea0ba03b5e6731ab60a144b4a607d0d7661c92a"} Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.270324 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qjdr"] Dec 15 14:00:54 crc kubenswrapper[4794]: W1215 14:00:54.275679 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c600b2_aa35_4a90_b45c_d15d6b2650d3.slice/crio-dc7a824778f9d6e9b312fb99bf27c29d8ef708a1ed4e0bdc17f73fca6763129e WatchSource:0}: Error finding container dc7a824778f9d6e9b312fb99bf27c29d8ef708a1ed4e0bdc17f73fca6763129e: Status 404 returned error can't find the container with id dc7a824778f9d6e9b312fb99bf27c29d8ef708a1ed4e0bdc17f73fca6763129e Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.534492 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.534852 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.534907 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.535998 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"153c3cfa232a964aa700c0cbd7b85f063a70242545eaeec076923fd5ffce712a"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:00:54 crc kubenswrapper[4794]: I1215 14:00:54.536099 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://153c3cfa232a964aa700c0cbd7b85f063a70242545eaeec076923fd5ffce712a" gracePeriod=600 Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.066259 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqqqj" event={"ID":"007bbddd-2d30-424a-b855-a317f4b14c3d","Type":"ContainerStarted","Data":"8ebcf60376ce8d44fae59eae246a3f017bf9c26b5fd32f7c9b8fd0101f067c85"} Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.070006 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="69c600b2-aa35-4a90-b45c-d15d6b2650d3" containerID="821f65627142b3a40a65acb3cf5b163cde524bfc8365974d9fc07e42fb70d071" exitCode=0 Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.070061 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qjdr" event={"ID":"69c600b2-aa35-4a90-b45c-d15d6b2650d3","Type":"ContainerDied","Data":"821f65627142b3a40a65acb3cf5b163cde524bfc8365974d9fc07e42fb70d071"} Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.070079 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qjdr" event={"ID":"69c600b2-aa35-4a90-b45c-d15d6b2650d3","Type":"ContainerStarted","Data":"dc7a824778f9d6e9b312fb99bf27c29d8ef708a1ed4e0bdc17f73fca6763129e"} Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.078779 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="153c3cfa232a964aa700c0cbd7b85f063a70242545eaeec076923fd5ffce712a" exitCode=0 Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.078851 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"153c3cfa232a964aa700c0cbd7b85f063a70242545eaeec076923fd5ffce712a"} Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.078878 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"d174f397d9867d6a6419d5dbb9dbec64c765e9ea6a8a992f1d4851ec00451343"} Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.078892 4794 scope.go:117] "RemoveContainer" containerID="f3e1bc0bfca6ad29f6ce842a17dec65fa30d235a2c25ce83ae08af352d4c7e45" Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.087136 4794 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-q8dqr" event={"ID":"5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf","Type":"ContainerStarted","Data":"07fef9a9f54463ba996e885e23c81267ee27483a64876771a0c4483ba63ba0ec"} Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.090112 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cqqqj" podStartSLOduration=2.25327224 podStartE2EDuration="5.090101537s" podCreationTimestamp="2025-12-15 14:00:50 +0000 UTC" firstStartedPulling="2025-12-15 14:00:52.033773721 +0000 UTC m=+413.885796179" lastFinishedPulling="2025-12-15 14:00:54.870603038 +0000 UTC m=+416.722625476" observedRunningTime="2025-12-15 14:00:55.087473849 +0000 UTC m=+416.939496287" watchObservedRunningTime="2025-12-15 14:00:55.090101537 +0000 UTC m=+416.942123985" Dec 15 14:00:55 crc kubenswrapper[4794]: I1215 14:00:55.148416 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q8dqr" podStartSLOduration=2.5907942999999998 podStartE2EDuration="4.148400949s" podCreationTimestamp="2025-12-15 14:00:51 +0000 UTC" firstStartedPulling="2025-12-15 14:00:53.038920118 +0000 UTC m=+414.890942556" lastFinishedPulling="2025-12-15 14:00:54.596526767 +0000 UTC m=+416.448549205" observedRunningTime="2025-12-15 14:00:55.145467233 +0000 UTC m=+416.997489691" watchObservedRunningTime="2025-12-15 14:00:55.148400949 +0000 UTC m=+417.000423377" Dec 15 14:00:56 crc kubenswrapper[4794]: I1215 14:00:56.096612 4794 generic.go:334] "Generic (PLEG): container finished" podID="1058fe26-4e3d-428c-9fcc-079b0efc1a33" containerID="04e0fa51658c434cf08d48cffd33388d3b109272298e5a78d68120a92df863ea" exitCode=0 Dec 15 14:00:56 crc kubenswrapper[4794]: I1215 14:00:56.097027 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl98z" 
event={"ID":"1058fe26-4e3d-428c-9fcc-079b0efc1a33","Type":"ContainerDied","Data":"04e0fa51658c434cf08d48cffd33388d3b109272298e5a78d68120a92df863ea"} Dec 15 14:00:56 crc kubenswrapper[4794]: I1215 14:00:56.101282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qjdr" event={"ID":"69c600b2-aa35-4a90-b45c-d15d6b2650d3","Type":"ContainerStarted","Data":"df277d98fd81a874114bb5b93e896450faffcd7c06b16767fd4d7aaa3bba78fd"} Dec 15 14:00:57 crc kubenswrapper[4794]: I1215 14:00:57.106990 4794 generic.go:334] "Generic (PLEG): container finished" podID="69c600b2-aa35-4a90-b45c-d15d6b2650d3" containerID="df277d98fd81a874114bb5b93e896450faffcd7c06b16767fd4d7aaa3bba78fd" exitCode=0 Dec 15 14:00:57 crc kubenswrapper[4794]: I1215 14:00:57.107091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qjdr" event={"ID":"69c600b2-aa35-4a90-b45c-d15d6b2650d3","Type":"ContainerDied","Data":"df277d98fd81a874114bb5b93e896450faffcd7c06b16767fd4d7aaa3bba78fd"} Dec 15 14:00:59 crc kubenswrapper[4794]: I1215 14:00:59.124710 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl98z" event={"ID":"1058fe26-4e3d-428c-9fcc-079b0efc1a33","Type":"ContainerStarted","Data":"6d0677f5772dae20c0faf15b1d62a1ccbfd14bcda72bd6d36fb29c5cc8790010"} Dec 15 14:00:59 crc kubenswrapper[4794]: I1215 14:00:59.127181 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qjdr" event={"ID":"69c600b2-aa35-4a90-b45c-d15d6b2650d3","Type":"ContainerStarted","Data":"9dba10c67937901080d627549ecfcade6face8de402f633cc6a4b7b4171aaf42"} Dec 15 14:00:59 crc kubenswrapper[4794]: I1215 14:00:59.166226 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hl98z" podStartSLOduration=4.030612257 podStartE2EDuration="7.166208102s" podCreationTimestamp="2025-12-15 14:00:52 +0000 
UTC" firstStartedPulling="2025-12-15 14:00:54.04622466 +0000 UTC m=+415.898247118" lastFinishedPulling="2025-12-15 14:00:57.181820535 +0000 UTC m=+419.033842963" observedRunningTime="2025-12-15 14:00:59.15082473 +0000 UTC m=+421.002847188" watchObservedRunningTime="2025-12-15 14:00:59.166208102 +0000 UTC m=+421.018230560" Dec 15 14:00:59 crc kubenswrapper[4794]: I1215 14:00:59.184748 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6qjdr" podStartSLOduration=3.034225213 podStartE2EDuration="6.184731156s" podCreationTimestamp="2025-12-15 14:00:53 +0000 UTC" firstStartedPulling="2025-12-15 14:00:55.072335595 +0000 UTC m=+416.924358033" lastFinishedPulling="2025-12-15 14:00:58.222841518 +0000 UTC m=+420.074863976" observedRunningTime="2025-12-15 14:00:59.182724327 +0000 UTC m=+421.034746785" watchObservedRunningTime="2025-12-15 14:00:59.184731156 +0000 UTC m=+421.036753604" Dec 15 14:01:00 crc kubenswrapper[4794]: I1215 14:01:00.903270 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:01:00 crc kubenswrapper[4794]: I1215 14:01:00.903923 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:01:00 crc kubenswrapper[4794]: I1215 14:01:00.945057 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:01:01 crc kubenswrapper[4794]: I1215 14:01:01.178429 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cqqqj" Dec 15 14:01:01 crc kubenswrapper[4794]: I1215 14:01:01.492813 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:01:01 crc kubenswrapper[4794]: I1215 14:01:01.492856 4794 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:01:01 crc kubenswrapper[4794]: I1215 14:01:01.562309 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:01:02 crc kubenswrapper[4794]: I1215 14:01:02.189738 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q8dqr" Dec 15 14:01:03 crc kubenswrapper[4794]: I1215 14:01:03.281447 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:01:03 crc kubenswrapper[4794]: I1215 14:01:03.281496 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:01:03 crc kubenswrapper[4794]: I1215 14:01:03.886366 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:01:03 crc kubenswrapper[4794]: I1215 14:01:03.886399 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:01:03 crc kubenswrapper[4794]: I1215 14:01:03.930121 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:01:04 crc kubenswrapper[4794]: I1215 14:01:04.198277 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6qjdr" Dec 15 14:01:04 crc kubenswrapper[4794]: I1215 14:01:04.339719 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hl98z" podUID="1058fe26-4e3d-428c-9fcc-079b0efc1a33" containerName="registry-server" probeResult="failure" output=< Dec 15 14:01:04 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Dec 15 
14:01:04 crc kubenswrapper[4794]: > Dec 15 14:01:13 crc kubenswrapper[4794]: I1215 14:01:13.341767 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:01:13 crc kubenswrapper[4794]: I1215 14:01:13.399232 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hl98z" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.022530 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" podUID="c235fb96-c626-4949-bd6e-a21dd37bc9d1" containerName="registry" containerID="cri-o://15380e638f9143fff388d4cc74e8a21779d87a7bdc36831e2bb73929408d13b7" gracePeriod=30 Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.234053 4794 generic.go:334] "Generic (PLEG): container finished" podID="c235fb96-c626-4949-bd6e-a21dd37bc9d1" containerID="15380e638f9143fff388d4cc74e8a21779d87a7bdc36831e2bb73929408d13b7" exitCode=0 Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.234104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" event={"ID":"c235fb96-c626-4949-bd6e-a21dd37bc9d1","Type":"ContainerDied","Data":"15380e638f9143fff388d4cc74e8a21779d87a7bdc36831e2bb73929408d13b7"} Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.461858 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552144 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-trusted-ca\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552190 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c235fb96-c626-4949-bd6e-a21dd37bc9d1-installation-pull-secrets\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552283 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c235fb96-c626-4949-bd6e-a21dd37bc9d1-ca-trust-extracted\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552331 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-tls\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552375 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-bound-sa-token\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552410 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-nxgk6\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-kube-api-access-nxgk6\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552565 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.552649 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-certificates\") pod \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\" (UID: \"c235fb96-c626-4949-bd6e-a21dd37bc9d1\") " Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.553062 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.553418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.561144 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c235fb96-c626-4949-bd6e-a21dd37bc9d1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.564748 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.565226 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-kube-api-access-nxgk6" (OuterVolumeSpecName: "kube-api-access-nxgk6") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "kube-api-access-nxgk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.568108 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.568759 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.574364 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c235fb96-c626-4949-bd6e-a21dd37bc9d1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c235fb96-c626-4949-bd6e-a21dd37bc9d1" (UID: "c235fb96-c626-4949-bd6e-a21dd37bc9d1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.654961 4794 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c235fb96-c626-4949-bd6e-a21dd37bc9d1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.655002 4794 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.655014 4794 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.655027 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxgk6\" (UniqueName: 
\"kubernetes.io/projected/c235fb96-c626-4949-bd6e-a21dd37bc9d1-kube-api-access-nxgk6\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.655043 4794 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.655055 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c235fb96-c626-4949-bd6e-a21dd37bc9d1-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:17 crc kubenswrapper[4794]: I1215 14:01:17.655067 4794 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c235fb96-c626-4949-bd6e-a21dd37bc9d1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 15 14:01:18 crc kubenswrapper[4794]: I1215 14:01:18.243623 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" event={"ID":"c235fb96-c626-4949-bd6e-a21dd37bc9d1","Type":"ContainerDied","Data":"9fd04e3f1cc0ccbdc1517cdac96318f7769744f5954692c2d952eb0b4fddb472"} Dec 15 14:01:18 crc kubenswrapper[4794]: I1215 14:01:18.243686 4794 scope.go:117] "RemoveContainer" containerID="15380e638f9143fff388d4cc74e8a21779d87a7bdc36831e2bb73929408d13b7" Dec 15 14:01:18 crc kubenswrapper[4794]: I1215 14:01:18.243800 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q5mq8" Dec 15 14:01:18 crc kubenswrapper[4794]: I1215 14:01:18.289629 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q5mq8"] Dec 15 14:01:18 crc kubenswrapper[4794]: I1215 14:01:18.295294 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q5mq8"] Dec 15 14:01:18 crc kubenswrapper[4794]: I1215 14:01:18.747138 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c235fb96-c626-4949-bd6e-a21dd37bc9d1" path="/var/lib/kubelet/pods/c235fb96-c626-4949-bd6e-a21dd37bc9d1/volumes" Dec 15 14:02:54 crc kubenswrapper[4794]: I1215 14:02:54.534897 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:02:54 crc kubenswrapper[4794]: I1215 14:02:54.535633 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:02:59 crc kubenswrapper[4794]: I1215 14:02:59.053639 4794 scope.go:117] "RemoveContainer" containerID="ea63348063b5827a95cf5eb6936939fa93531af90af599752f2e7de48dea2ecf" Dec 15 14:03:24 crc kubenswrapper[4794]: I1215 14:03:24.534381 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:03:24 
crc kubenswrapper[4794]: I1215 14:03:24.535101 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:03:54 crc kubenswrapper[4794]: I1215 14:03:54.534398 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:03:54 crc kubenswrapper[4794]: I1215 14:03:54.535108 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:03:54 crc kubenswrapper[4794]: I1215 14:03:54.535182 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:03:54 crc kubenswrapper[4794]: I1215 14:03:54.535977 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d174f397d9867d6a6419d5dbb9dbec64c765e9ea6a8a992f1d4851ec00451343"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:03:54 crc kubenswrapper[4794]: I1215 14:03:54.536040 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" 
podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://d174f397d9867d6a6419d5dbb9dbec64c765e9ea6a8a992f1d4851ec00451343" gracePeriod=600 Dec 15 14:03:55 crc kubenswrapper[4794]: I1215 14:03:55.256866 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="d174f397d9867d6a6419d5dbb9dbec64c765e9ea6a8a992f1d4851ec00451343" exitCode=0 Dec 15 14:03:55 crc kubenswrapper[4794]: I1215 14:03:55.256938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"d174f397d9867d6a6419d5dbb9dbec64c765e9ea6a8a992f1d4851ec00451343"} Dec 15 14:03:55 crc kubenswrapper[4794]: I1215 14:03:55.257460 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"abdd64a3133374cb12ad4cc9a75da62f63e41a60a16866f7f2bcba98242002ca"} Dec 15 14:03:55 crc kubenswrapper[4794]: I1215 14:03:55.257486 4794 scope.go:117] "RemoveContainer" containerID="153c3cfa232a964aa700c0cbd7b85f063a70242545eaeec076923fd5ffce712a" Dec 15 14:05:54 crc kubenswrapper[4794]: I1215 14:05:54.533954 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:05:54 crc kubenswrapper[4794]: I1215 14:05:54.534812 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 15 14:06:24 crc kubenswrapper[4794]: I1215 14:06:24.534806 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:06:24 crc kubenswrapper[4794]: I1215 14:06:24.535308 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.904955 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578"] Dec 15 14:06:30 crc kubenswrapper[4794]: E1215 14:06:30.905693 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c235fb96-c626-4949-bd6e-a21dd37bc9d1" containerName="registry" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.905714 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c235fb96-c626-4949-bd6e-a21dd37bc9d1" containerName="registry" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.905880 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c235fb96-c626-4949-bd6e-a21dd37bc9d1" containerName="registry" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.907132 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.913484 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.923134 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv4kx\" (UniqueName: \"kubernetes.io/projected/736d163e-7de0-4138-8a7e-74db6b7d5efc-kube-api-access-qv4kx\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.923203 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.923255 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:30 crc kubenswrapper[4794]: I1215 14:06:30.932344 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578"] Dec 15 14:06:31 crc kubenswrapper[4794]: 
I1215 14:06:31.024212 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.024618 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv4kx\" (UniqueName: \"kubernetes.io/projected/736d163e-7de0-4138-8a7e-74db6b7d5efc-kube-api-access-qv4kx\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.024647 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.025006 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.025060 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.042294 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv4kx\" (UniqueName: \"kubernetes.io/projected/736d163e-7de0-4138-8a7e-74db6b7d5efc-kube-api-access-qv4kx\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.282486 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:31 crc kubenswrapper[4794]: I1215 14:06:31.468625 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578"] Dec 15 14:06:32 crc kubenswrapper[4794]: I1215 14:06:32.291052 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" event={"ID":"736d163e-7de0-4138-8a7e-74db6b7d5efc","Type":"ContainerStarted","Data":"2ed32bfc2919e34904386b7bd60fe9e58fddde6ad143f0912d357f9fa3425105"} Dec 15 14:06:32 crc kubenswrapper[4794]: I1215 14:06:32.291446 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" event={"ID":"736d163e-7de0-4138-8a7e-74db6b7d5efc","Type":"ContainerStarted","Data":"9aaf98ca4106fa93b501fa6b97c716194fa39d52fd2ee79fa78f12344b4bf164"} Dec 15 14:06:33 crc kubenswrapper[4794]: I1215 14:06:33.303960 4794 
generic.go:334] "Generic (PLEG): container finished" podID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerID="2ed32bfc2919e34904386b7bd60fe9e58fddde6ad143f0912d357f9fa3425105" exitCode=0 Dec 15 14:06:33 crc kubenswrapper[4794]: I1215 14:06:33.304039 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" event={"ID":"736d163e-7de0-4138-8a7e-74db6b7d5efc","Type":"ContainerDied","Data":"2ed32bfc2919e34904386b7bd60fe9e58fddde6ad143f0912d357f9fa3425105"} Dec 15 14:06:33 crc kubenswrapper[4794]: I1215 14:06:33.306546 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 14:06:35 crc kubenswrapper[4794]: I1215 14:06:35.316147 4794 generic.go:334] "Generic (PLEG): container finished" podID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerID="dae06cdbd12f94f4ca5fb17bb9ffdc11640dc2a60819ff28f01e64432bc7b7fb" exitCode=0 Dec 15 14:06:35 crc kubenswrapper[4794]: I1215 14:06:35.316221 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" event={"ID":"736d163e-7de0-4138-8a7e-74db6b7d5efc","Type":"ContainerDied","Data":"dae06cdbd12f94f4ca5fb17bb9ffdc11640dc2a60819ff28f01e64432bc7b7fb"} Dec 15 14:06:36 crc kubenswrapper[4794]: I1215 14:06:36.324626 4794 generic.go:334] "Generic (PLEG): container finished" podID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerID="6d1b984502560d94c479f9c2487db3cf102e187b9724d4cf6ca782a856c64001" exitCode=0 Dec 15 14:06:36 crc kubenswrapper[4794]: I1215 14:06:36.324690 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" event={"ID":"736d163e-7de0-4138-8a7e-74db6b7d5efc","Type":"ContainerDied","Data":"6d1b984502560d94c479f9c2487db3cf102e187b9724d4cf6ca782a856c64001"} Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 
14:06:37.602196 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.607371 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-util\") pod \"736d163e-7de0-4138-8a7e-74db6b7d5efc\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.607487 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv4kx\" (UniqueName: \"kubernetes.io/projected/736d163e-7de0-4138-8a7e-74db6b7d5efc-kube-api-access-qv4kx\") pod \"736d163e-7de0-4138-8a7e-74db6b7d5efc\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.607535 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-bundle\") pod \"736d163e-7de0-4138-8a7e-74db6b7d5efc\" (UID: \"736d163e-7de0-4138-8a7e-74db6b7d5efc\") " Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.610893 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-bundle" (OuterVolumeSpecName: "bundle") pod "736d163e-7de0-4138-8a7e-74db6b7d5efc" (UID: "736d163e-7de0-4138-8a7e-74db6b7d5efc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.613768 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736d163e-7de0-4138-8a7e-74db6b7d5efc-kube-api-access-qv4kx" (OuterVolumeSpecName: "kube-api-access-qv4kx") pod "736d163e-7de0-4138-8a7e-74db6b7d5efc" (UID: "736d163e-7de0-4138-8a7e-74db6b7d5efc"). InnerVolumeSpecName "kube-api-access-qv4kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.620178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-util" (OuterVolumeSpecName: "util") pod "736d163e-7de0-4138-8a7e-74db6b7d5efc" (UID: "736d163e-7de0-4138-8a7e-74db6b7d5efc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.709133 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-util\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.709167 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv4kx\" (UniqueName: \"kubernetes.io/projected/736d163e-7de0-4138-8a7e-74db6b7d5efc-kube-api-access-qv4kx\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:37 crc kubenswrapper[4794]: I1215 14:06:37.709177 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/736d163e-7de0-4138-8a7e-74db6b7d5efc-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:38 crc kubenswrapper[4794]: I1215 14:06:38.334840 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" 
event={"ID":"736d163e-7de0-4138-8a7e-74db6b7d5efc","Type":"ContainerDied","Data":"9aaf98ca4106fa93b501fa6b97c716194fa39d52fd2ee79fa78f12344b4bf164"} Dec 15 14:06:38 crc kubenswrapper[4794]: I1215 14:06:38.334878 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aaf98ca4106fa93b501fa6b97c716194fa39d52fd2ee79fa78f12344b4bf164" Dec 15 14:06:38 crc kubenswrapper[4794]: I1215 14:06:38.334897 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578" Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.707926 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cwnfl"] Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.711910 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-controller" containerID="cri-o://65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.712562 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="northd" containerID="cri-o://bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.712873 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="sbdb" containerID="cri-o://f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.712865 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-node" containerID="cri-o://ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.712954 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="nbdb" containerID="cri-o://891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.712995 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-acl-logging" containerID="cri-o://9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.713053 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f" gracePeriod=30 Dec 15 14:06:41 crc kubenswrapper[4794]: I1215 14:06:41.754307 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" containerID="cri-o://903c510f249c8b109fb6a118aad01f0358640936b98688be21968f0a1b3024ad" gracePeriod=30 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.149493 4794 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.360009 4794 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/2.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.360366 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/1.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.360397 4794 generic.go:334] "Generic (PLEG): container finished" podID="0bc89ecc-eb8e-4926-bbb7-14c90f449e00" containerID="76c2abef58cd747a318832e9c9a8f53a0a66e4077a34e21078a3dc23196cbc26" exitCode=2 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.360434 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerDied","Data":"76c2abef58cd747a318832e9c9a8f53a0a66e4077a34e21078a3dc23196cbc26"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.360463 4794 scope.go:117] "RemoveContainer" containerID="38615e948620ac0ca417cc4875e38bf8eb38dd1d7e5da7afe562469cb19cb171" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.360854 4794 scope.go:117] "RemoveContainer" containerID="76c2abef58cd747a318832e9c9a8f53a0a66e4077a34e21078a3dc23196cbc26" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.363567 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/3.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.367265 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovn-acl-logging/0.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.367763 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovn-controller/0.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 
14:06:42.368142 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="903c510f249c8b109fb6a118aad01f0358640936b98688be21968f0a1b3024ad" exitCode=0 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368160 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7" exitCode=0 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368167 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e" exitCode=0 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368173 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757" exitCode=0 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368180 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f" exitCode=0 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368186 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81" exitCode=0 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368193 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486" exitCode=143 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368200 4794 generic.go:334] "Generic (PLEG): container finished" podID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerID="65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9" 
exitCode=143 Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368218 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"903c510f249c8b109fb6a118aad01f0358640936b98688be21968f0a1b3024ad"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368241 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368251 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368260 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368268 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368286 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.368294 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9"} Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.398296 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovnkube-controller/3.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.401682 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovn-acl-logging/0.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.402229 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovn-controller/0.log" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.402724 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.407078 4794 scope.go:117] "RemoveContainer" containerID="a7397ff9ef60a563020910c2ed7fac0a53a3ac87e23316bc9e93b87d5e6acd4c" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.455950 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tpc7b"] Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456160 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-node" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456180 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-node" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456191 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="pull" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456199 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="pull" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456211 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456218 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456231 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="nbdb" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456238 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="nbdb" Dec 15 14:06:42 crc 
kubenswrapper[4794]: E1215 14:06:42.456247 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="extract" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456255 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="extract" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456265 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456273 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456283 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-acl-logging" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456290 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-acl-logging" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456300 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-ovn-metrics" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456309 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-ovn-metrics" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456319 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="util" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456326 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="util" Dec 15 14:06:42 crc kubenswrapper[4794]: 
E1215 14:06:42.456337 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456344 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456351 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456358 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456367 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456374 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456384 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kubecfg-setup" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456392 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kubecfg-setup" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456404 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="northd" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456411 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="northd" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456421 4794 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="sbdb" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456427 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="sbdb" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456530 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="nbdb" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456545 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456553 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="sbdb" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456561 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-ovn-metrics" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456569 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-acl-logging" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456596 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456604 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456611 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovn-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456622 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="kube-rbac-proxy-node" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456632 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="northd" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456640 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="736d163e-7de0-4138-8a7e-74db6b7d5efc" containerName="extract" Dec 15 14:06:42 crc kubenswrapper[4794]: E1215 14:06:42.456750 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456759 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456869 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.456882 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" containerName="ovnkube-controller" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.458491 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465657 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-slash\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465718 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-log-socket\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465741 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-ovn-kubernetes\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465769 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-slash" (OuterVolumeSpecName: "host-slash") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465785 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-systemd\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-log-socket" (OuterVolumeSpecName: "log-socket") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465794 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465805 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-ovn\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465828 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465849 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-netns\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465881 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465914 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465869 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-etc-openvswitch\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465972 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxv8r\" (UniqueName: \"kubernetes.io/projected/628fdda9-19ac-4a1d-a93b-82a10124a8ad-kube-api-access-pxv8r\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.465980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466018 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-script-lib\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466605 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-node-log" (OuterVolumeSpecName: "node-log") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466879 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466913 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-node-log\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466936 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-config\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466954 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-netd\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466983 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovn-node-metrics-cert\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.466996 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-openvswitch\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467089 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467275 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-kubelet\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467312 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-bin\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467327 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467369 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467479 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-env-overrides\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467481 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467631 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467736 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-var-lib-openvswitch\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467765 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467807 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-systemd-units\") pod \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\" (UID: \"628fdda9-19ac-4a1d-a93b-82a10124a8ad\") " Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467851 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.467945 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468007 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-kubelet\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468032 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-cni-bin\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468065 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-cni-netd\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468088 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-ovn\") pod \"ovnkube-node-tpc7b\" (UID: 
\"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468112 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-slash\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468185 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468205 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-env-overrides\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468226 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-systemd\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468248 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468278 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-systemd-units\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468307 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-ovnkube-script-lib\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468333 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-node-log\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468365 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-run-ovn-kubernetes\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468397 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-run-netns\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468437 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6927r\" (UniqueName: \"kubernetes.io/projected/36ab2da4-842d-45f3-8581-bfee4d800e85-kube-api-access-6927r\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468469 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-log-socket\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468500 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36ab2da4-842d-45f3-8581-bfee4d800e85-ovn-node-metrics-cert\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468527 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-etc-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 
14:06:42.468551 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-var-lib-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468569 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-ovnkube-config\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468666 4794 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468681 4794 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468693 4794 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468705 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468718 4794 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468730 4794 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-node-log\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468741 4794 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468752 4794 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468763 4794 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468773 4794 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/628fdda9-19ac-4a1d-a93b-82a10124a8ad-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468785 4794 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468796 4794 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc 
kubenswrapper[4794]: I1215 14:06:42.468806 4794 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468817 4794 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468829 4794 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-log-socket\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468841 4794 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-slash\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.468853 4794 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.471592 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628fdda9-19ac-4a1d-a93b-82a10124a8ad-kube-api-access-pxv8r" (OuterVolumeSpecName: "kube-api-access-pxv8r") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "kube-api-access-pxv8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.474676 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.487633 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "628fdda9-19ac-4a1d-a93b-82a10124a8ad" (UID: "628fdda9-19ac-4a1d-a93b-82a10124a8ad"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570009 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-systemd-units\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570062 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-ovnkube-script-lib\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570147 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-systemd-units\") pod \"ovnkube-node-tpc7b\" (UID: 
\"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570226 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-node-log\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570666 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-ovnkube-script-lib\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570723 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-node-log\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570772 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-run-ovn-kubernetes\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-run-netns\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" 
Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570845 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-run-ovn-kubernetes\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570874 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6927r\" (UniqueName: \"kubernetes.io/projected/36ab2da4-842d-45f3-8581-bfee4d800e85-kube-api-access-6927r\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570894 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-log-socket\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570942 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-log-socket\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570910 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36ab2da4-842d-45f3-8581-bfee4d800e85-ovn-node-metrics-cert\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.570981 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-etc-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571036 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-run-netns\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-var-lib-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571052 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-etc-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571091 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-ovnkube-config\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571140 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-var-lib-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571199 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-kubelet\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571226 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-cni-bin\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571236 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-kubelet\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571269 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-cni-bin\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571270 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-cni-netd\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571298 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-cni-netd\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571323 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-ovn\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571351 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-slash\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571406 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571412 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-ovn\") pod \"ovnkube-node-tpc7b\" (UID: 
\"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571428 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-env-overrides\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571452 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-systemd\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571427 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-slash\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571476 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571491 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-systemd\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571454 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-run-openvswitch\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571532 4794 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/628fdda9-19ac-4a1d-a93b-82a10124a8ad-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571535 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36ab2da4-842d-45f3-8581-bfee4d800e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571545 4794 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/628fdda9-19ac-4a1d-a93b-82a10124a8ad-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571556 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxv8r\" (UniqueName: \"kubernetes.io/projected/628fdda9-19ac-4a1d-a93b-82a10124a8ad-kube-api-access-pxv8r\") on node \"crc\" DevicePath \"\"" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571926 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-env-overrides\") pod \"ovnkube-node-tpc7b\" (UID: 
\"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.571991 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36ab2da4-842d-45f3-8581-bfee4d800e85-ovnkube-config\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.574148 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36ab2da4-842d-45f3-8581-bfee4d800e85-ovn-node-metrics-cert\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.584750 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6927r\" (UniqueName: \"kubernetes.io/projected/36ab2da4-842d-45f3-8581-bfee4d800e85-kube-api-access-6927r\") pod \"ovnkube-node-tpc7b\" (UID: \"36ab2da4-842d-45f3-8581-bfee4d800e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: I1215 14:06:42.779878 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:42 crc kubenswrapper[4794]: W1215 14:06:42.807467 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36ab2da4_842d_45f3_8581_bfee4d800e85.slice/crio-d0436bbbbc99e2114aa8ae4129d9e65a6b9eb0867052b739a5297cb8423fff7c WatchSource:0}: Error finding container d0436bbbbc99e2114aa8ae4129d9e65a6b9eb0867052b739a5297cb8423fff7c: Status 404 returned error can't find the container with id d0436bbbbc99e2114aa8ae4129d9e65a6b9eb0867052b739a5297cb8423fff7c Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.374335 4794 generic.go:334] "Generic (PLEG): container finished" podID="36ab2da4-842d-45f3-8581-bfee4d800e85" containerID="0da953db17126fc433c73954801ea79aab7afb5bce270ebd8c214f848e952a7e" exitCode=0 Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.374425 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerDied","Data":"0da953db17126fc433c73954801ea79aab7afb5bce270ebd8c214f848e952a7e"} Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.374751 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"d0436bbbbc99e2114aa8ae4129d9e65a6b9eb0867052b739a5297cb8423fff7c"} Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.378147 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovn-acl-logging/0.log" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.379183 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cwnfl_628fdda9-19ac-4a1d-a93b-82a10124a8ad/ovn-controller/0.log" Dec 15 14:06:43 crc 
kubenswrapper[4794]: I1215 14:06:43.379553 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" event={"ID":"628fdda9-19ac-4a1d-a93b-82a10124a8ad","Type":"ContainerDied","Data":"2af415f6f69391cfb1f68c3874b3b2fce7fa22a82cbab5a1d7b70729c0713f85"} Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.379615 4794 scope.go:117] "RemoveContainer" containerID="903c510f249c8b109fb6a118aad01f0358640936b98688be21968f0a1b3024ad" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.379617 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cwnfl" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.381563 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9nm7_0bc89ecc-eb8e-4926-bbb7-14c90f449e00/kube-multus/2.log" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.381622 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9nm7" event={"ID":"0bc89ecc-eb8e-4926-bbb7-14c90f449e00","Type":"ContainerStarted","Data":"90bef556a53081916c669e7c4c55491e7200f46120f45d0e4d5067774144bd74"} Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.402065 4794 scope.go:117] "RemoveContainer" containerID="f0a1541577c2d452cc2dff66b5ea8e2cf63a2751f890c263b11e4d73fc2f4bc7" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.426398 4794 scope.go:117] "RemoveContainer" containerID="891c86dd4b76c02adb1b22cc4523cc6fa74e784c8389590d9085e2f03d361e1e" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.448277 4794 scope.go:117] "RemoveContainer" containerID="bab72abdf60819ff5059ca37fb7d367ec40bb4df077a973f3bd6ef7569220757" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.451153 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cwnfl"] Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.462371 4794 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cwnfl"] Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.467610 4794 scope.go:117] "RemoveContainer" containerID="81d023f17e43c4003281ada8b68dec4b4132be72c4430a3be14ab6c21cb9f59f" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.486535 4794 scope.go:117] "RemoveContainer" containerID="ec7a7c3e420802f52a9b7ce695966a081139c2a52f1101cd7811398f0f2acc81" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.507134 4794 scope.go:117] "RemoveContainer" containerID="9cd77be01b701491480ebb9cf29d96ac366fe3af802eac44fc09e846e424c486" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.538647 4794 scope.go:117] "RemoveContainer" containerID="65ddcb27a08252a7b2ea19162b6fcad57fb2b8f9080ef6221abbb73b8d9511b9" Dec 15 14:06:43 crc kubenswrapper[4794]: I1215 14:06:43.604990 4794 scope.go:117] "RemoveContainer" containerID="9f1ee99cafea3a945e195656a4058cd2d449b58befe0bfe52b7c4299ca1cb8e2" Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.404794 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"c0212de7bb413acdcc120adcffa81b31111de12d5a1a61ec5082770418187a0d"} Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.405094 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"2cb0d91e24bde8c6e1b713b3434c72bb4290d027b5742f695619253b868ee39f"} Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.405106 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"39e29f09117e47714bf425836b6eb9b561139363d0a79ccc9a086081c349f8c3"} Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.405116 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"f6dab8bdfb9cf5834c15d9e1259c67f022000644cd6bdea2360671a058a7f3b4"} Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.405124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"1dbcd9558914d4334a85afbd9f2b15a11fdb965fc34bb7f681886f1fc26c06d8"} Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.405132 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"444dcdec9d8b5069f498c9c54db0c47dcb2537adb8fee120c7340aa8893ac395"} Dec 15 14:06:44 crc kubenswrapper[4794]: I1215 14:06:44.742435 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628fdda9-19ac-4a1d-a93b-82a10124a8ad" path="/var/lib/kubelet/pods/628fdda9-19ac-4a1d-a93b-82a10124a8ad/volumes" Dec 15 14:06:47 crc kubenswrapper[4794]: I1215 14:06:47.421880 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"242f8520d36e44f20f400b9f6fc31e8ec9c7bae3b536edbee785a4379b34576f"} Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.749800 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"] Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.750872 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.759353 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.759609 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.763103 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-85bl6" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.823524 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"] Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.824184 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.826447 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-q55bc" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.826685 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.829555 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"] Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.830248 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.945972 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/097c120c-be56-4089-b752-36706c337bcf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8\" (UID: \"097c120c-be56-4089-b752-36706c337bcf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.946015 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01c0c788-8deb-456f-b515-255500832030-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9\" (UID: \"01c0c788-8deb-456f-b515-255500832030\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.946061 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01c0c788-8deb-456f-b515-255500832030-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9\" (UID: \"01c0c788-8deb-456f-b515-255500832030\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.946102 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m8n7\" (UniqueName: \"kubernetes.io/projected/661de35e-03be-4902-9273-ae4f7d165a16-kube-api-access-8m8n7\") pod \"obo-prometheus-operator-668cf9dfbb-d4bvp\" (UID: \"661de35e-03be-4902-9273-ae4f7d165a16\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" Dec 15 14:06:48 crc 
kubenswrapper[4794]: I1215 14:06:48.946315 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/097c120c-be56-4089-b752-36706c337bcf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8\" (UID: \"097c120c-be56-4089-b752-36706c337bcf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.965796 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5cd75"]
Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.966464 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.970390 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-2hr2k"
Dec 15 14:06:48 crc kubenswrapper[4794]: I1215 14:06:48.970552 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.047302 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01c0c788-8deb-456f-b515-255500832030-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9\" (UID: \"01c0c788-8deb-456f-b515-255500832030\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.047386 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8n7\" (UniqueName: \"kubernetes.io/projected/661de35e-03be-4902-9273-ae4f7d165a16-kube-api-access-8m8n7\") pod \"obo-prometheus-operator-668cf9dfbb-d4bvp\" (UID: \"661de35e-03be-4902-9273-ae4f7d165a16\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.047434 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/097c120c-be56-4089-b752-36706c337bcf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8\" (UID: \"097c120c-be56-4089-b752-36706c337bcf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.047455 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/097c120c-be56-4089-b752-36706c337bcf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8\" (UID: \"097c120c-be56-4089-b752-36706c337bcf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.047477 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01c0c788-8deb-456f-b515-255500832030-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9\" (UID: \"01c0c788-8deb-456f-b515-255500832030\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.052918 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/097c120c-be56-4089-b752-36706c337bcf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8\" (UID: \"097c120c-be56-4089-b752-36706c337bcf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.053438 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01c0c788-8deb-456f-b515-255500832030-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9\" (UID: \"01c0c788-8deb-456f-b515-255500832030\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.054015 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/097c120c-be56-4089-b752-36706c337bcf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8\" (UID: \"097c120c-be56-4089-b752-36706c337bcf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.054701 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01c0c788-8deb-456f-b515-255500832030-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9\" (UID: \"01c0c788-8deb-456f-b515-255500832030\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.072892 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-qsqps"]
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.073712 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.074126 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8n7\" (UniqueName: \"kubernetes.io/projected/661de35e-03be-4902-9273-ae4f7d165a16-kube-api-access-8m8n7\") pod \"obo-prometheus-operator-668cf9dfbb-d4bvp\" (UID: \"661de35e-03be-4902-9273-ae4f7d165a16\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.075495 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-5xt4w"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.082235 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.105721 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(e6d820332edd6dc61e51ebfbd3ffd4fda95169e5b05a85e665621075d0c45beb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.105782 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(e6d820332edd6dc61e51ebfbd3ffd4fda95169e5b05a85e665621075d0c45beb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.105804 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(e6d820332edd6dc61e51ebfbd3ffd4fda95169e5b05a85e665621075d0c45beb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.105847 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators(661de35e-03be-4902-9273-ae4f7d165a16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators(661de35e-03be-4902-9273-ae4f7d165a16)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(e6d820332edd6dc61e51ebfbd3ffd4fda95169e5b05a85e665621075d0c45beb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" podUID="661de35e-03be-4902-9273-ae4f7d165a16"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.142603 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.151598 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqcth\" (UniqueName: \"kubernetes.io/projected/f160af63-166e-45de-8a47-cf3fbda615ed-kube-api-access-cqcth\") pod \"observability-operator-d8bb48f5d-5cd75\" (UID: \"f160af63-166e-45de-8a47-cf3fbda615ed\") " pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.151647 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vts\" (UniqueName: \"kubernetes.io/projected/981129d7-9a49-4888-a19c-3c2924e854c8-kube-api-access-92vts\") pod \"perses-operator-5446b9c989-qsqps\" (UID: \"981129d7-9a49-4888-a19c-3c2924e854c8\") " pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.151699 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f160af63-166e-45de-8a47-cf3fbda615ed-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5cd75\" (UID: \"f160af63-166e-45de-8a47-cf3fbda615ed\") " pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.151730 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/981129d7-9a49-4888-a19c-3c2924e854c8-openshift-service-ca\") pod \"perses-operator-5446b9c989-qsqps\" (UID: \"981129d7-9a49-4888-a19c-3c2924e854c8\") " pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.153719 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.170451 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(a09ecc2838cc0bf9fac102cbdcdb05a246dcf2cc790b484f9a9189db77049c09): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.170509 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(a09ecc2838cc0bf9fac102cbdcdb05a246dcf2cc790b484f9a9189db77049c09): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.170531 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(a09ecc2838cc0bf9fac102cbdcdb05a246dcf2cc790b484f9a9189db77049c09): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.170720 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators(097c120c-be56-4089-b752-36706c337bcf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators(097c120c-be56-4089-b752-36706c337bcf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(a09ecc2838cc0bf9fac102cbdcdb05a246dcf2cc790b484f9a9189db77049c09): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" podUID="097c120c-be56-4089-b752-36706c337bcf"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.177179 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(b679a8ebab35996f286bd8d18da30e0eccc9d624a132ced9df46f3fc9627dcbb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.177229 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(b679a8ebab35996f286bd8d18da30e0eccc9d624a132ced9df46f3fc9627dcbb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.177252 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(b679a8ebab35996f286bd8d18da30e0eccc9d624a132ced9df46f3fc9627dcbb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.177291 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators(01c0c788-8deb-456f-b515-255500832030)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators(01c0c788-8deb-456f-b515-255500832030)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(b679a8ebab35996f286bd8d18da30e0eccc9d624a132ced9df46f3fc9627dcbb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" podUID="01c0c788-8deb-456f-b515-255500832030"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.252385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f160af63-166e-45de-8a47-cf3fbda615ed-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5cd75\" (UID: \"f160af63-166e-45de-8a47-cf3fbda615ed\") " pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.252432 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/981129d7-9a49-4888-a19c-3c2924e854c8-openshift-service-ca\") pod \"perses-operator-5446b9c989-qsqps\" (UID: \"981129d7-9a49-4888-a19c-3c2924e854c8\") " pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.252481 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqcth\" (UniqueName: \"kubernetes.io/projected/f160af63-166e-45de-8a47-cf3fbda615ed-kube-api-access-cqcth\") pod \"observability-operator-d8bb48f5d-5cd75\" (UID: \"f160af63-166e-45de-8a47-cf3fbda615ed\") " pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.252504 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vts\" (UniqueName: \"kubernetes.io/projected/981129d7-9a49-4888-a19c-3c2924e854c8-kube-api-access-92vts\") pod \"perses-operator-5446b9c989-qsqps\" (UID: \"981129d7-9a49-4888-a19c-3c2924e854c8\") " pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.253540 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/981129d7-9a49-4888-a19c-3c2924e854c8-openshift-service-ca\") pod \"perses-operator-5446b9c989-qsqps\" (UID: \"981129d7-9a49-4888-a19c-3c2924e854c8\") " pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.255398 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f160af63-166e-45de-8a47-cf3fbda615ed-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-5cd75\" (UID: \"f160af63-166e-45de-8a47-cf3fbda615ed\") " pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.273526 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vts\" (UniqueName: \"kubernetes.io/projected/981129d7-9a49-4888-a19c-3c2924e854c8-kube-api-access-92vts\") pod \"perses-operator-5446b9c989-qsqps\" (UID: \"981129d7-9a49-4888-a19c-3c2924e854c8\") " pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.275553 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqcth\" (UniqueName: \"kubernetes.io/projected/f160af63-166e-45de-8a47-cf3fbda615ed-kube-api-access-cqcth\") pod \"observability-operator-d8bb48f5d-5cd75\" (UID: \"f160af63-166e-45de-8a47-cf3fbda615ed\") " pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.279082 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.300068 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(edf02550b950ba21c05bb5ec34affe812ce51bafbf95e26e1b47356e14bcd515): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.300121 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(edf02550b950ba21c05bb5ec34affe812ce51bafbf95e26e1b47356e14bcd515): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.300147 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(edf02550b950ba21c05bb5ec34affe812ce51bafbf95e26e1b47356e14bcd515): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.300199 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-5cd75_openshift-operators(f160af63-166e-45de-8a47-cf3fbda615ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-5cd75_openshift-operators(f160af63-166e-45de-8a47-cf3fbda615ed)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(edf02550b950ba21c05bb5ec34affe812ce51bafbf95e26e1b47356e14bcd515): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" podUID="f160af63-166e-45de-8a47-cf3fbda615ed"
Dec 15 14:06:49 crc kubenswrapper[4794]: I1215 14:06:49.406803 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.428677 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(266833da8a2f4ed683aaf1683dd2b4fa93717cf674690234ad24a4e6fa6ad66b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.428748 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(266833da8a2f4ed683aaf1683dd2b4fa93717cf674690234ad24a4e6fa6ad66b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.428774 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(266833da8a2f4ed683aaf1683dd2b4fa93717cf674690234ad24a4e6fa6ad66b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:49 crc kubenswrapper[4794]: E1215 14:06:49.428829 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-qsqps_openshift-operators(981129d7-9a49-4888-a19c-3c2924e854c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-qsqps_openshift-operators(981129d7-9a49-4888-a19c-3c2924e854c8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(266833da8a2f4ed683aaf1683dd2b4fa93717cf674690234ad24a4e6fa6ad66b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-qsqps" podUID="981129d7-9a49-4888-a19c-3c2924e854c8"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.442120 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" event={"ID":"36ab2da4-842d-45f3-8581-bfee4d800e85","Type":"ContainerStarted","Data":"9a0d5c611a9d7e1ef01696ba8d4124981547cb7597a2c892a79257baa3208dc7"}
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.800223 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-qsqps"]
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.800368 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.800808 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.808836 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"]
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.808928 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.809300 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.828798 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(e8a8071567baebf3d7a9dd2e3d30d9abd68a673a5ffc51f6bc40f2305ffe2525): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.828859 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(e8a8071567baebf3d7a9dd2e3d30d9abd68a673a5ffc51f6bc40f2305ffe2525): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.828877 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(e8a8071567baebf3d7a9dd2e3d30d9abd68a673a5ffc51f6bc40f2305ffe2525): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-qsqps"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.828916 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-qsqps_openshift-operators(981129d7-9a49-4888-a19c-3c2924e854c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-qsqps_openshift-operators(981129d7-9a49-4888-a19c-3c2924e854c8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-qsqps_openshift-operators_981129d7-9a49-4888-a19c-3c2924e854c8_0(e8a8071567baebf3d7a9dd2e3d30d9abd68a673a5ffc51f6bc40f2305ffe2525): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-qsqps" podUID="981129d7-9a49-4888-a19c-3c2924e854c8"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.837750 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(519821dc250ce5ea42a92e68784dd357e47d1cbbb1fb628ef4b9ef6378d090c1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.837842 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(519821dc250ce5ea42a92e68784dd357e47d1cbbb1fb628ef4b9ef6378d090c1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.837868 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(519821dc250ce5ea42a92e68784dd357e47d1cbbb1fb628ef4b9ef6378d090c1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.837908 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators(661de35e-03be-4902-9273-ae4f7d165a16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators(661de35e-03be-4902-9273-ae4f7d165a16)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-d4bvp_openshift-operators_661de35e-03be-4902-9273-ae4f7d165a16_0(519821dc250ce5ea42a92e68784dd357e47d1cbbb1fb628ef4b9ef6378d090c1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" podUID="661de35e-03be-4902-9273-ae4f7d165a16"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.868135 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"]
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.868269 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.868694 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.875891 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"]
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.875983 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.876347 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.879826 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5cd75"]
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.879924 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:50 crc kubenswrapper[4794]: I1215 14:06:50.880271 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.908869 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(525b9c35fe5e41bc9bc3ae8b034bbdec6fa80e77152d434babf8b8c0c0628535): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.908946 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(525b9c35fe5e41bc9bc3ae8b034bbdec6fa80e77152d434babf8b8c0c0628535): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.908971 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(525b9c35fe5e41bc9bc3ae8b034bbdec6fa80e77152d434babf8b8c0c0628535): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.909021 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators(01c0c788-8deb-456f-b515-255500832030)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators(01c0c788-8deb-456f-b515-255500832030)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_openshift-operators_01c0c788-8deb-456f-b515-255500832030_0(525b9c35fe5e41bc9bc3ae8b034bbdec6fa80e77152d434babf8b8c0c0628535): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" podUID="01c0c788-8deb-456f-b515-255500832030"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.924731 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(65fea40322a7fc6b5c6a413bb82b4d589336cbba914a0d1741b8062f31b6ab70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.924795 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(65fea40322a7fc6b5c6a413bb82b4d589336cbba914a0d1741b8062f31b6ab70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.924818 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(65fea40322a7fc6b5c6a413bb82b4d589336cbba914a0d1741b8062f31b6ab70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.924862 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators(097c120c-be56-4089-b752-36706c337bcf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators(097c120c-be56-4089-b752-36706c337bcf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_openshift-operators_097c120c-be56-4089-b752-36706c337bcf_0(65fea40322a7fc6b5c6a413bb82b4d589336cbba914a0d1741b8062f31b6ab70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" podUID="097c120c-be56-4089-b752-36706c337bcf"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.938833 4794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(f922fda5df604fce9eea995ccc00bb5041b23a4f77319be3a9f95a5d1e20d450): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.938921 4794 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(f922fda5df604fce9eea995ccc00bb5041b23a4f77319be3a9f95a5d1e20d450): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75"
Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.938946 4794 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(f922fda5df604fce9eea995ccc00bb5041b23a4f77319be3a9f95a5d1e20d450): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" Dec 15 14:06:50 crc kubenswrapper[4794]: E1215 14:06:50.938994 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-5cd75_openshift-operators(f160af63-166e-45de-8a47-cf3fbda615ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-5cd75_openshift-operators(f160af63-166e-45de-8a47-cf3fbda615ed)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-5cd75_openshift-operators_f160af63-166e-45de-8a47-cf3fbda615ed_0(f922fda5df604fce9eea995ccc00bb5041b23a4f77319be3a9f95a5d1e20d450): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" podUID="f160af63-166e-45de-8a47-cf3fbda615ed" Dec 15 14:06:51 crc kubenswrapper[4794]: I1215 14:06:51.446715 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:51 crc kubenswrapper[4794]: I1215 14:06:51.446953 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:51 crc kubenswrapper[4794]: I1215 14:06:51.470371 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:51 crc kubenswrapper[4794]: I1215 14:06:51.498225 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" podStartSLOduration=9.498210869 podStartE2EDuration="9.498210869s" podCreationTimestamp="2025-12-15 14:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:06:51.494934397 +0000 UTC m=+773.346956835" 
watchObservedRunningTime="2025-12-15 14:06:51.498210869 +0000 UTC m=+773.350233307" Dec 15 14:06:52 crc kubenswrapper[4794]: I1215 14:06:52.455502 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:52 crc kubenswrapper[4794]: I1215 14:06:52.491553 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:06:54 crc kubenswrapper[4794]: I1215 14:06:54.534043 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:06:54 crc kubenswrapper[4794]: I1215 14:06:54.535206 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:06:54 crc kubenswrapper[4794]: I1215 14:06:54.535401 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:06:54 crc kubenswrapper[4794]: I1215 14:06:54.536366 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abdd64a3133374cb12ad4cc9a75da62f63e41a60a16866f7f2bcba98242002ca"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:06:54 crc kubenswrapper[4794]: I1215 14:06:54.536613 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://abdd64a3133374cb12ad4cc9a75da62f63e41a60a16866f7f2bcba98242002ca" gracePeriod=600 Dec 15 14:06:55 crc kubenswrapper[4794]: I1215 14:06:55.471562 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="abdd64a3133374cb12ad4cc9a75da62f63e41a60a16866f7f2bcba98242002ca" exitCode=0 Dec 15 14:06:55 crc kubenswrapper[4794]: I1215 14:06:55.471619 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"abdd64a3133374cb12ad4cc9a75da62f63e41a60a16866f7f2bcba98242002ca"} Dec 15 14:06:55 crc kubenswrapper[4794]: I1215 14:06:55.471970 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"412df8a876437db1b8dca58aacf9cc551c4e0ee66a2d4593a2815bb14d689e05"} Dec 15 14:06:55 crc kubenswrapper[4794]: I1215 14:06:55.472005 4794 scope.go:117] "RemoveContainer" containerID="d174f397d9867d6a6419d5dbb9dbec64c765e9ea6a8a992f1d4851ec00451343" Dec 15 14:07:01 crc kubenswrapper[4794]: I1215 14:07:01.736657 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" Dec 15 14:07:01 crc kubenswrapper[4794]: I1215 14:07:01.738742 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" Dec 15 14:07:02 crc kubenswrapper[4794]: I1215 14:07:02.228366 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8"] Dec 15 14:07:02 crc kubenswrapper[4794]: W1215 14:07:02.239452 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod097c120c_be56_4089_b752_36706c337bcf.slice/crio-2c72d79d5bf2940dd46fa9203d77a5eabdbe2c4f6573e0da899204cc08066f13 WatchSource:0}: Error finding container 2c72d79d5bf2940dd46fa9203d77a5eabdbe2c4f6573e0da899204cc08066f13: Status 404 returned error can't find the container with id 2c72d79d5bf2940dd46fa9203d77a5eabdbe2c4f6573e0da899204cc08066f13 Dec 15 14:07:02 crc kubenswrapper[4794]: I1215 14:07:02.517773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" event={"ID":"097c120c-be56-4089-b752-36706c337bcf","Type":"ContainerStarted","Data":"2c72d79d5bf2940dd46fa9203d77a5eabdbe2c4f6573e0da899204cc08066f13"} Dec 15 14:07:02 crc kubenswrapper[4794]: I1215 14:07:02.736449 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" Dec 15 14:07:02 crc kubenswrapper[4794]: I1215 14:07:02.736952 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" Dec 15 14:07:03 crc kubenswrapper[4794]: I1215 14:07:03.015236 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9"] Dec 15 14:07:03 crc kubenswrapper[4794]: W1215 14:07:03.033796 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c0c788_8deb_456f_b515_255500832030.slice/crio-92dac1ca244ba768937b1b827a08da4dab7510b5450eef442977616e88da434b WatchSource:0}: Error finding container 92dac1ca244ba768937b1b827a08da4dab7510b5450eef442977616e88da434b: Status 404 returned error can't find the container with id 92dac1ca244ba768937b1b827a08da4dab7510b5450eef442977616e88da434b Dec 15 14:07:03 crc kubenswrapper[4794]: I1215 14:07:03.525858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" event={"ID":"01c0c788-8deb-456f-b515-255500832030","Type":"ContainerStarted","Data":"92dac1ca244ba768937b1b827a08da4dab7510b5450eef442977616e88da434b"} Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:03.736215 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:03.736329 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:03.736822 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:03.737000 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" Dec 15 14:07:04 crc kubenswrapper[4794]: W1215 14:07:04.677655 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod661de35e_03be_4902_9273_ae4f7d165a16.slice/crio-57cac162457fdaf375e5d39af06303f1d6d574c98999c18251b69cedbf7ea8ae WatchSource:0}: Error finding container 57cac162457fdaf375e5d39af06303f1d6d574c98999c18251b69cedbf7ea8ae: Status 404 returned error can't find the container with id 57cac162457fdaf375e5d39af06303f1d6d574c98999c18251b69cedbf7ea8ae Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:04.686289 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp"] Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:04.737755 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qsqps" Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:04.738447 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-qsqps" Dec 15 14:07:04 crc kubenswrapper[4794]: I1215 14:07:04.790944 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-5cd75"] Dec 15 14:07:05 crc kubenswrapper[4794]: I1215 14:07:05.058103 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-qsqps"] Dec 15 14:07:05 crc kubenswrapper[4794]: W1215 14:07:05.069178 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod981129d7_9a49_4888_a19c_3c2924e854c8.slice/crio-8864fc666c82cfa6f76310148021b0a814981916ab39bcbe45e2d39e09628057 WatchSource:0}: Error finding container 8864fc666c82cfa6f76310148021b0a814981916ab39bcbe45e2d39e09628057: Status 404 returned error can't find the container with id 8864fc666c82cfa6f76310148021b0a814981916ab39bcbe45e2d39e09628057 Dec 15 14:07:05 crc kubenswrapper[4794]: I1215 14:07:05.571505 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" event={"ID":"661de35e-03be-4902-9273-ae4f7d165a16","Type":"ContainerStarted","Data":"57cac162457fdaf375e5d39af06303f1d6d574c98999c18251b69cedbf7ea8ae"} Dec 15 14:07:05 crc kubenswrapper[4794]: I1215 14:07:05.596893 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-qsqps" event={"ID":"981129d7-9a49-4888-a19c-3c2924e854c8","Type":"ContainerStarted","Data":"8864fc666c82cfa6f76310148021b0a814981916ab39bcbe45e2d39e09628057"} Dec 15 14:07:05 crc kubenswrapper[4794]: I1215 14:07:05.601869 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" event={"ID":"f160af63-166e-45de-8a47-cf3fbda615ed","Type":"ContainerStarted","Data":"43bda3bbd8abb43d3e0ee9c590e56832d0e362f5856c4880a5182c2c9d0bbaa6"} Dec 15 
14:07:12 crc kubenswrapper[4794]: I1215 14:07:12.810481 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tpc7b" Dec 15 14:07:18 crc kubenswrapper[4794]: I1215 14:07:18.748717 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-qsqps" Dec 15 14:07:18 crc kubenswrapper[4794]: I1215 14:07:18.750398 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-qsqps" event={"ID":"981129d7-9a49-4888-a19c-3c2924e854c8","Type":"ContainerStarted","Data":"608ba90aab5c99512c4fb0b70b7d3ed934bc98731409d2deb422bb25f5d6de28"} Dec 15 14:07:18 crc kubenswrapper[4794]: I1215 14:07:18.750535 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" event={"ID":"f160af63-166e-45de-8a47-cf3fbda615ed","Type":"ContainerStarted","Data":"1e34877f7115a3682345fbacd120488d74f6516fbc96255aaae4c0d742077dba"} Dec 15 14:07:18 crc kubenswrapper[4794]: I1215 14:07:18.768842 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-qsqps" podStartSLOduration=16.458634222 podStartE2EDuration="29.768825575s" podCreationTimestamp="2025-12-15 14:06:49 +0000 UTC" firstStartedPulling="2025-12-15 14:07:05.072034924 +0000 UTC m=+786.924057362" lastFinishedPulling="2025-12-15 14:07:18.382226277 +0000 UTC m=+800.234248715" observedRunningTime="2025-12-15 14:07:18.766361885 +0000 UTC m=+800.618384353" watchObservedRunningTime="2025-12-15 14:07:18.768825575 +0000 UTC m=+800.620848013" Dec 15 14:07:18 crc kubenswrapper[4794]: I1215 14:07:18.787947 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" podStartSLOduration=17.172846793 podStartE2EDuration="30.787924875s" podCreationTimestamp="2025-12-15 14:06:48 +0000 UTC" 
firstStartedPulling="2025-12-15 14:07:04.813350371 +0000 UTC m=+786.665372809" lastFinishedPulling="2025-12-15 14:07:18.428428453 +0000 UTC m=+800.280450891" observedRunningTime="2025-12-15 14:07:18.785216288 +0000 UTC m=+800.637238726" watchObservedRunningTime="2025-12-15 14:07:18.787924875 +0000 UTC m=+800.639947343" Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.283958 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.287020 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-5cd75" Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.744814 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" event={"ID":"097c120c-be56-4089-b752-36706c337bcf","Type":"ContainerStarted","Data":"961093b0e04785192a45307a3ff9029ef556abef9d16d6687d21721e12a99760"} Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.746453 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" event={"ID":"661de35e-03be-4902-9273-ae4f7d165a16","Type":"ContainerStarted","Data":"38df5d8647eb44bb5ee100dd46afba59fe2c05ffd594f6ab0c094fc3db9bd630"} Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.748076 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" event={"ID":"01c0c788-8deb-456f-b515-255500832030","Type":"ContainerStarted","Data":"f0635313a02b8d107fe1f7b3674c99e4cc78c24c675ed1de5f3295ea6abafb18"} Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.768572 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8" 
podStartSLOduration=15.65223431 podStartE2EDuration="31.768540542s" podCreationTimestamp="2025-12-15 14:06:48 +0000 UTC" firstStartedPulling="2025-12-15 14:07:02.240938169 +0000 UTC m=+784.092960607" lastFinishedPulling="2025-12-15 14:07:18.357244381 +0000 UTC m=+800.209266839" observedRunningTime="2025-12-15 14:07:19.767560465 +0000 UTC m=+801.619582923" watchObservedRunningTime="2025-12-15 14:07:19.768540542 +0000 UTC m=+801.620563000" Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.797871 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-d4bvp" podStartSLOduration=18.083078173 podStartE2EDuration="31.797849831s" podCreationTimestamp="2025-12-15 14:06:48 +0000 UTC" firstStartedPulling="2025-12-15 14:07:04.684190632 +0000 UTC m=+786.536213080" lastFinishedPulling="2025-12-15 14:07:18.3989623 +0000 UTC m=+800.250984738" observedRunningTime="2025-12-15 14:07:19.795052702 +0000 UTC m=+801.647075200" watchObservedRunningTime="2025-12-15 14:07:19.797849831 +0000 UTC m=+801.649872269" Dec 15 14:07:19 crc kubenswrapper[4794]: I1215 14:07:19.820482 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9" podStartSLOduration=16.485604817 podStartE2EDuration="31.8204641s" podCreationTimestamp="2025-12-15 14:06:48 +0000 UTC" firstStartedPulling="2025-12-15 14:07:03.036591759 +0000 UTC m=+784.888614187" lastFinishedPulling="2025-12-15 14:07:18.371450992 +0000 UTC m=+800.223473470" observedRunningTime="2025-12-15 14:07:19.817793405 +0000 UTC m=+801.669815853" watchObservedRunningTime="2025-12-15 14:07:19.8204641 +0000 UTC m=+801.672486538" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.464943 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d"] Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 
14:07:24.467854 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.474480 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.490358 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d"] Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.531378 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.531479 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.531527 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rxx\" (UniqueName: \"kubernetes.io/projected/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-kube-api-access-p9rxx\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " 
pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.633079 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.633167 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.633203 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rxx\" (UniqueName: \"kubernetes.io/projected/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-kube-api-access-p9rxx\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.633850 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.633865 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.658842 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rxx\" (UniqueName: \"kubernetes.io/projected/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-kube-api-access-p9rxx\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:24 crc kubenswrapper[4794]: I1215 14:07:24.829723 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:25 crc kubenswrapper[4794]: I1215 14:07:25.045222 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d"] Dec 15 14:07:25 crc kubenswrapper[4794]: I1215 14:07:25.784554 4794 generic.go:334] "Generic (PLEG): container finished" podID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerID="7c698c3b56c763402e680a4f2163cc40ea8904635d2b890d803293f4cc103f22" exitCode=0 Dec 15 14:07:25 crc kubenswrapper[4794]: I1215 14:07:25.784620 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" event={"ID":"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46","Type":"ContainerDied","Data":"7c698c3b56c763402e680a4f2163cc40ea8904635d2b890d803293f4cc103f22"} Dec 15 14:07:25 crc kubenswrapper[4794]: I1215 14:07:25.784650 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" event={"ID":"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46","Type":"ContainerStarted","Data":"41aa4b3a1c1b32c4fca984469c2b16645c7a1c6cd115c56093dd54bc32bd0166"} Dec 15 14:07:26 crc kubenswrapper[4794]: I1215 14:07:26.833631 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r9kf7"] Dec 15 14:07:26 crc kubenswrapper[4794]: I1215 14:07:26.835189 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:26 crc kubenswrapper[4794]: I1215 14:07:26.856338 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9kf7"] Dec 15 14:07:26 crc kubenswrapper[4794]: I1215 14:07:26.963782 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbzq8\" (UniqueName: \"kubernetes.io/projected/56970bfd-3919-4105-8711-20282b5831b9-kube-api-access-cbzq8\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:26 crc kubenswrapper[4794]: I1215 14:07:26.963950 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-utilities\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:26 crc kubenswrapper[4794]: I1215 14:07:26.964021 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-catalog-content\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " 
pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.065259 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-utilities\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.065318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-catalog-content\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.065354 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbzq8\" (UniqueName: \"kubernetes.io/projected/56970bfd-3919-4105-8711-20282b5831b9-kube-api-access-cbzq8\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.065870 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-utilities\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.065934 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-catalog-content\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc 
kubenswrapper[4794]: I1215 14:07:27.094885 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbzq8\" (UniqueName: \"kubernetes.io/projected/56970bfd-3919-4105-8711-20282b5831b9-kube-api-access-cbzq8\") pod \"redhat-operators-r9kf7\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.152059 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.387144 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9kf7"] Dec 15 14:07:27 crc kubenswrapper[4794]: W1215 14:07:27.399164 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56970bfd_3919_4105_8711_20282b5831b9.slice/crio-ab501301743e6f8ca365a42f7104a18d4e312b327d5434eeca6990fb4b319bad WatchSource:0}: Error finding container ab501301743e6f8ca365a42f7104a18d4e312b327d5434eeca6990fb4b319bad: Status 404 returned error can't find the container with id ab501301743e6f8ca365a42f7104a18d4e312b327d5434eeca6990fb4b319bad Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.796231 4794 generic.go:334] "Generic (PLEG): container finished" podID="56970bfd-3919-4105-8711-20282b5831b9" containerID="493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814" exitCode=0 Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.796529 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerDied","Data":"493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814"} Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.796706 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerStarted","Data":"ab501301743e6f8ca365a42f7104a18d4e312b327d5434eeca6990fb4b319bad"} Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.799294 4794 generic.go:334] "Generic (PLEG): container finished" podID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerID="8c1852607340daff6dec6d3ae72d99a0b01b5150da1e3e0ebc8c50a5dba03e07" exitCode=0 Dec 15 14:07:27 crc kubenswrapper[4794]: I1215 14:07:27.799324 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" event={"ID":"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46","Type":"ContainerDied","Data":"8c1852607340daff6dec6d3ae72d99a0b01b5150da1e3e0ebc8c50a5dba03e07"} Dec 15 14:07:28 crc kubenswrapper[4794]: I1215 14:07:28.808655 4794 generic.go:334] "Generic (PLEG): container finished" podID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerID="4df74002f01967909ce5c4eac59bb6cf2219ef0cb8185f0c86de9f7c83bdbe34" exitCode=0 Dec 15 14:07:28 crc kubenswrapper[4794]: I1215 14:07:28.808714 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" event={"ID":"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46","Type":"ContainerDied","Data":"4df74002f01967909ce5c4eac59bb6cf2219ef0cb8185f0c86de9f7c83bdbe34"} Dec 15 14:07:29 crc kubenswrapper[4794]: I1215 14:07:29.409699 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-qsqps" Dec 15 14:07:29 crc kubenswrapper[4794]: I1215 14:07:29.820505 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerStarted","Data":"ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b"} Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 
14:07:30.184414 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.326705 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9rxx\" (UniqueName: \"kubernetes.io/projected/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-kube-api-access-p9rxx\") pod \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.326773 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-util\") pod \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.326836 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-bundle\") pod \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\" (UID: \"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46\") " Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.327413 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-bundle" (OuterVolumeSpecName: "bundle") pod "81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" (UID: "81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.331772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-kube-api-access-p9rxx" (OuterVolumeSpecName: "kube-api-access-p9rxx") pod "81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" (UID: "81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46"). InnerVolumeSpecName "kube-api-access-p9rxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.344005 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-util" (OuterVolumeSpecName: "util") pod "81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" (UID: "81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.428535 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9rxx\" (UniqueName: \"kubernetes.io/projected/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-kube-api-access-p9rxx\") on node \"crc\" DevicePath \"\"" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.428569 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-util\") on node \"crc\" DevicePath \"\"" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.428590 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.827429 4794 generic.go:334] "Generic (PLEG): container finished" podID="56970bfd-3919-4105-8711-20282b5831b9" containerID="ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b" exitCode=0 Dec 15 14:07:30 crc 
kubenswrapper[4794]: I1215 14:07:30.827503 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerDied","Data":"ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b"} Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.829595 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" event={"ID":"81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46","Type":"ContainerDied","Data":"41aa4b3a1c1b32c4fca984469c2b16645c7a1c6cd115c56093dd54bc32bd0166"} Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.829627 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41aa4b3a1c1b32c4fca984469c2b16645c7a1c6cd115c56093dd54bc32bd0166" Dec 15 14:07:30 crc kubenswrapper[4794]: I1215 14:07:30.829686 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d" Dec 15 14:07:31 crc kubenswrapper[4794]: I1215 14:07:31.836545 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerStarted","Data":"89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437"} Dec 15 14:07:31 crc kubenswrapper[4794]: I1215 14:07:31.851845 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r9kf7" podStartSLOduration=2.362491735 podStartE2EDuration="5.851827286s" podCreationTimestamp="2025-12-15 14:07:26 +0000 UTC" firstStartedPulling="2025-12-15 14:07:27.798216744 +0000 UTC m=+809.650239182" lastFinishedPulling="2025-12-15 14:07:31.287552295 +0000 UTC m=+813.139574733" observedRunningTime="2025-12-15 14:07:31.850831368 +0000 UTC m=+813.702853836" 
watchObservedRunningTime="2025-12-15 14:07:31.851827286 +0000 UTC m=+813.703849724" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.960192 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-7mjgk"] Dec 15 14:07:33 crc kubenswrapper[4794]: E1215 14:07:33.961929 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="pull" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.962046 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="pull" Dec 15 14:07:33 crc kubenswrapper[4794]: E1215 14:07:33.962130 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="util" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.962202 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="util" Dec 15 14:07:33 crc kubenswrapper[4794]: E1215 14:07:33.962284 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="extract" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.962356 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="extract" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.962550 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46" containerName="extract" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.963377 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.965888 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mfhf9" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.966576 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.966735 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 15 14:07:33 crc kubenswrapper[4794]: I1215 14:07:33.969127 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-7mjgk"] Dec 15 14:07:34 crc kubenswrapper[4794]: I1215 14:07:34.081714 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62p2l\" (UniqueName: \"kubernetes.io/projected/ed156fbf-3a73-46b0-9f8c-6f233151f987-kube-api-access-62p2l\") pod \"nmstate-operator-6769fb99d-7mjgk\" (UID: \"ed156fbf-3a73-46b0-9f8c-6f233151f987\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" Dec 15 14:07:34 crc kubenswrapper[4794]: I1215 14:07:34.183012 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62p2l\" (UniqueName: \"kubernetes.io/projected/ed156fbf-3a73-46b0-9f8c-6f233151f987-kube-api-access-62p2l\") pod \"nmstate-operator-6769fb99d-7mjgk\" (UID: \"ed156fbf-3a73-46b0-9f8c-6f233151f987\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" Dec 15 14:07:34 crc kubenswrapper[4794]: I1215 14:07:34.206982 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62p2l\" (UniqueName: \"kubernetes.io/projected/ed156fbf-3a73-46b0-9f8c-6f233151f987-kube-api-access-62p2l\") pod \"nmstate-operator-6769fb99d-7mjgk\" (UID: 
\"ed156fbf-3a73-46b0-9f8c-6f233151f987\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" Dec 15 14:07:34 crc kubenswrapper[4794]: I1215 14:07:34.281191 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" Dec 15 14:07:34 crc kubenswrapper[4794]: I1215 14:07:34.549223 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-7mjgk"] Dec 15 14:07:34 crc kubenswrapper[4794]: W1215 14:07:34.554552 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded156fbf_3a73_46b0_9f8c_6f233151f987.slice/crio-5c8919415e03cab2b919357dc9519bc1108ffd12f174c62877bb0f167997d2cd WatchSource:0}: Error finding container 5c8919415e03cab2b919357dc9519bc1108ffd12f174c62877bb0f167997d2cd: Status 404 returned error can't find the container with id 5c8919415e03cab2b919357dc9519bc1108ffd12f174c62877bb0f167997d2cd Dec 15 14:07:34 crc kubenswrapper[4794]: I1215 14:07:34.853008 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" event={"ID":"ed156fbf-3a73-46b0-9f8c-6f233151f987","Type":"ContainerStarted","Data":"5c8919415e03cab2b919357dc9519bc1108ffd12f174c62877bb0f167997d2cd"} Dec 15 14:07:37 crc kubenswrapper[4794]: I1215 14:07:37.152456 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:37 crc kubenswrapper[4794]: I1215 14:07:37.152528 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:37 crc kubenswrapper[4794]: I1215 14:07:37.205909 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:37 crc kubenswrapper[4794]: I1215 14:07:37.920480 4794 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:39 crc kubenswrapper[4794]: I1215 14:07:39.622482 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r9kf7"] Dec 15 14:07:39 crc kubenswrapper[4794]: I1215 14:07:39.898422 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r9kf7" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="registry-server" containerID="cri-o://89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437" gracePeriod=2 Dec 15 14:07:39 crc kubenswrapper[4794]: I1215 14:07:39.899089 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" event={"ID":"ed156fbf-3a73-46b0-9f8c-6f233151f987","Type":"ContainerStarted","Data":"fa6687a96196fc2c75347973184246ba45564534871b2a5a9489546baa0f55f6"} Dec 15 14:07:39 crc kubenswrapper[4794]: I1215 14:07:39.928991 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-7mjgk" podStartSLOduration=2.138685113 podStartE2EDuration="6.928966388s" podCreationTimestamp="2025-12-15 14:07:33 +0000 UTC" firstStartedPulling="2025-12-15 14:07:34.55661547 +0000 UTC m=+816.408637908" lastFinishedPulling="2025-12-15 14:07:39.346896745 +0000 UTC m=+821.198919183" observedRunningTime="2025-12-15 14:07:39.921324552 +0000 UTC m=+821.773347080" watchObservedRunningTime="2025-12-15 14:07:39.928966388 +0000 UTC m=+821.780988856" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.262564 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.404215 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-catalog-content\") pod \"56970bfd-3919-4105-8711-20282b5831b9\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.404353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-utilities\") pod \"56970bfd-3919-4105-8711-20282b5831b9\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.404392 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbzq8\" (UniqueName: \"kubernetes.io/projected/56970bfd-3919-4105-8711-20282b5831b9-kube-api-access-cbzq8\") pod \"56970bfd-3919-4105-8711-20282b5831b9\" (UID: \"56970bfd-3919-4105-8711-20282b5831b9\") " Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.405362 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-utilities" (OuterVolumeSpecName: "utilities") pod "56970bfd-3919-4105-8711-20282b5831b9" (UID: "56970bfd-3919-4105-8711-20282b5831b9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.405624 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.410453 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56970bfd-3919-4105-8711-20282b5831b9-kube-api-access-cbzq8" (OuterVolumeSpecName: "kube-api-access-cbzq8") pod "56970bfd-3919-4105-8711-20282b5831b9" (UID: "56970bfd-3919-4105-8711-20282b5831b9"). InnerVolumeSpecName "kube-api-access-cbzq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.507026 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbzq8\" (UniqueName: \"kubernetes.io/projected/56970bfd-3919-4105-8711-20282b5831b9-kube-api-access-cbzq8\") on node \"crc\" DevicePath \"\"" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.905137 4794 generic.go:334] "Generic (PLEG): container finished" podID="56970bfd-3919-4105-8711-20282b5831b9" containerID="89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437" exitCode=0 Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.905757 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r9kf7" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.906110 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerDied","Data":"89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437"} Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.906143 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9kf7" event={"ID":"56970bfd-3919-4105-8711-20282b5831b9","Type":"ContainerDied","Data":"ab501301743e6f8ca365a42f7104a18d4e312b327d5434eeca6990fb4b319bad"} Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.906163 4794 scope.go:117] "RemoveContainer" containerID="89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.922978 4794 scope.go:117] "RemoveContainer" containerID="ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.941453 4794 scope.go:117] "RemoveContainer" containerID="493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.960783 4794 scope.go:117] "RemoveContainer" containerID="89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437" Dec 15 14:07:40 crc kubenswrapper[4794]: E1215 14:07:40.961378 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437\": container with ID starting with 89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437 not found: ID does not exist" containerID="89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.961473 4794 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437"} err="failed to get container status \"89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437\": rpc error: code = NotFound desc = could not find container \"89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437\": container with ID starting with 89e1998f40b483bf50b0276da7455cbfb08a6bc1f806e7d90e9480812c7ca437 not found: ID does not exist" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.961506 4794 scope.go:117] "RemoveContainer" containerID="ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b" Dec 15 14:07:40 crc kubenswrapper[4794]: E1215 14:07:40.961906 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b\": container with ID starting with ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b not found: ID does not exist" containerID="ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.961944 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b"} err="failed to get container status \"ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b\": rpc error: code = NotFound desc = could not find container \"ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b\": container with ID starting with ece3cdc977953d182f6dcde56d0de432933d12fae242cc890e47529b948dac0b not found: ID does not exist" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.961970 4794 scope.go:117] "RemoveContainer" containerID="493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814" Dec 15 14:07:40 crc kubenswrapper[4794]: E1215 14:07:40.962339 4794 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814\": container with ID starting with 493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814 not found: ID does not exist" containerID="493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814" Dec 15 14:07:40 crc kubenswrapper[4794]: I1215 14:07:40.962365 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814"} err="failed to get container status \"493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814\": rpc error: code = NotFound desc = could not find container \"493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814\": container with ID starting with 493c09dcfdf0f88057144725fdecd612cc08e19c395f296108a06f9be49ca814 not found: ID does not exist" Dec 15 14:07:41 crc kubenswrapper[4794]: I1215 14:07:41.451433 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56970bfd-3919-4105-8711-20282b5831b9" (UID: "56970bfd-3919-4105-8711-20282b5831b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:07:41 crc kubenswrapper[4794]: I1215 14:07:41.520798 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56970bfd-3919-4105-8711-20282b5831b9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:07:41 crc kubenswrapper[4794]: I1215 14:07:41.547354 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r9kf7"] Dec 15 14:07:41 crc kubenswrapper[4794]: I1215 14:07:41.556290 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r9kf7"] Dec 15 14:07:42 crc kubenswrapper[4794]: I1215 14:07:42.747538 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56970bfd-3919-4105-8711-20282b5831b9" path="/var/lib/kubelet/pods/56970bfd-3919-4105-8711-20282b5831b9/volumes" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.660048 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd"] Dec 15 14:07:46 crc kubenswrapper[4794]: E1215 14:07:46.661290 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="extract-utilities" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.661402 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="extract-utilities" Dec 15 14:07:46 crc kubenswrapper[4794]: E1215 14:07:46.661487 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="extract-content" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.661557 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="extract-content" Dec 15 14:07:46 crc kubenswrapper[4794]: E1215 14:07:46.661667 4794 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="registry-server" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.661740 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="registry-server" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.661926 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="56970bfd-3919-4105-8711-20282b5831b9" containerName="registry-server" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.662705 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.666203 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-6ttr6" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.672334 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-cblrd"] Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.674024 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.677635 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.697110 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd"] Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.717086 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-cblrd"] Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.745020 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-p8r8k"] Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.745832 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.789070 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpjg\" (UniqueName: \"kubernetes.io/projected/32475a8d-44fa-4fd8-9b4a-e907db6bcd4e-kube-api-access-kmpjg\") pod \"nmstate-webhook-f8fb84555-cblrd\" (UID: \"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.789173 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/32475a8d-44fa-4fd8-9b4a-e907db6bcd4e-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-cblrd\" (UID: \"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.789293 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxg5b\" 
(UniqueName: \"kubernetes.io/projected/1ae0e754-4486-4c14-b87d-df2cfc6a94dd-kube-api-access-lxg5b\") pod \"nmstate-metrics-7f7f7578db-q2rhd\" (UID: \"1ae0e754-4486-4c14-b87d-df2cfc6a94dd\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.795670 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr"] Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.798852 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.802214 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.802232 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kxfvz" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.802266 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.809711 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr"] Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890404 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-nmstate-lock\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890497 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpjg\" (UniqueName: \"kubernetes.io/projected/32475a8d-44fa-4fd8-9b4a-e907db6bcd4e-kube-api-access-kmpjg\") 
pod \"nmstate-webhook-f8fb84555-cblrd\" (UID: \"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890524 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890542 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890644 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-ovs-socket\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890742 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f78k\" (UniqueName: \"kubernetes.io/projected/645d79e0-7c57-4dee-8065-beffdff79fa2-kube-api-access-6f78k\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890760 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" 
(UniqueName: \"kubernetes.io/secret/32475a8d-44fa-4fd8-9b4a-e907db6bcd4e-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-cblrd\" (UID: \"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890791 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-dbus-socket\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890833 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxg5b\" (UniqueName: \"kubernetes.io/projected/1ae0e754-4486-4c14-b87d-df2cfc6a94dd-kube-api-access-lxg5b\") pod \"nmstate-metrics-7f7f7578db-q2rhd\" (UID: \"1ae0e754-4486-4c14-b87d-df2cfc6a94dd\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.890863 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgj2s\" (UniqueName: \"kubernetes.io/projected/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-kube-api-access-tgj2s\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.897656 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/32475a8d-44fa-4fd8-9b4a-e907db6bcd4e-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-cblrd\" (UID: \"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.906508 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lxg5b\" (UniqueName: \"kubernetes.io/projected/1ae0e754-4486-4c14-b87d-df2cfc6a94dd-kube-api-access-lxg5b\") pod \"nmstate-metrics-7f7f7578db-q2rhd\" (UID: \"1ae0e754-4486-4c14-b87d-df2cfc6a94dd\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.926272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpjg\" (UniqueName: \"kubernetes.io/projected/32475a8d-44fa-4fd8-9b4a-e907db6bcd4e-kube-api-access-kmpjg\") pod \"nmstate-webhook-f8fb84555-cblrd\" (UID: \"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.991934 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f78k\" (UniqueName: \"kubernetes.io/projected/645d79e0-7c57-4dee-8065-beffdff79fa2-kube-api-access-6f78k\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992005 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-dbus-socket\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992058 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgj2s\" (UniqueName: \"kubernetes.io/projected/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-kube-api-access-tgj2s\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992101 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-nmstate-lock\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992162 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992236 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-ovs-socket\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992329 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-ovs-socket\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992335 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-dbus-socket\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.992390 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/645d79e0-7c57-4dee-8065-beffdff79fa2-nmstate-lock\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:46 crc kubenswrapper[4794]: E1215 14:07:46.992477 4794 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 15 14:07:46 crc kubenswrapper[4794]: E1215 14:07:46.992554 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-plugin-serving-cert podName:b9245ea6-901c-4e5a-a3be-d7184ace8e8c nodeName:}" failed. No retries permitted until 2025-12-15 14:07:47.49253025 +0000 UTC m=+829.344552708 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-m6bgr" (UID: "b9245ea6-901c-4e5a-a3be-d7184ace8e8c") : secret "plugin-serving-cert" not found Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.993110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:46 crc kubenswrapper[4794]: I1215 14:07:46.997496 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.008858 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.025284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgj2s\" (UniqueName: \"kubernetes.io/projected/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-kube-api-access-tgj2s\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.026322 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f78k\" (UniqueName: \"kubernetes.io/projected/645d79e0-7c57-4dee-8065-beffdff79fa2-kube-api-access-6f78k\") pod \"nmstate-handler-p8r8k\" (UID: \"645d79e0-7c57-4dee-8065-beffdff79fa2\") " pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.041408 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-548dc59476-42nnj"] Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.042181 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.044914 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-548dc59476-42nnj"] Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.063770 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:47 crc kubenswrapper[4794]: W1215 14:07:47.090331 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod645d79e0_7c57_4dee_8065_beffdff79fa2.slice/crio-56dd902c5e3bbc09dceb649253b276a0ad13fd929ebb680d47ec50d90cc61dcd WatchSource:0}: Error finding container 56dd902c5e3bbc09dceb649253b276a0ad13fd929ebb680d47ec50d90cc61dcd: Status 404 returned error can't find the container with id 56dd902c5e3bbc09dceb649253b276a0ad13fd929ebb680d47ec50d90cc61dcd Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194366 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-serving-cert\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-console-config\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk868\" (UniqueName: \"kubernetes.io/projected/1f763743-6d60-4726-aa92-05e1e720e035-kube-api-access-wk868\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194466 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-oauth-config\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194508 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-trusted-ca-bundle\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194543 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-oauth-serving-cert\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.194566 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-service-ca\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.295893 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-console-config\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 
14:07:47.295947 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk868\" (UniqueName: \"kubernetes.io/projected/1f763743-6d60-4726-aa92-05e1e720e035-kube-api-access-wk868\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.295975 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-oauth-config\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.296024 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-trusted-ca-bundle\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.296064 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-oauth-serving-cert\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.296093 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-service-ca\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.296156 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-serving-cert\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.297656 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-console-config\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.298320 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-trusted-ca-bundle\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.300260 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-oauth-serving-cert\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.300806 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-service-ca\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.306012 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-serving-cert\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.310200 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-oauth-config\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.323496 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk868\" (UniqueName: \"kubernetes.io/projected/1f763743-6d60-4726-aa92-05e1e720e035-kube-api-access-wk868\") pod \"console-548dc59476-42nnj\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") " pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.344125 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd"] Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.400542 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.498574 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.501333 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9245ea6-901c-4e5a-a3be-d7184ace8e8c-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-m6bgr\" (UID: \"b9245ea6-901c-4e5a-a3be-d7184ace8e8c\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.565199 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-548dc59476-42nnj"] Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.592646 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-cblrd"] Dec 15 14:07:47 crc kubenswrapper[4794]: W1215 14:07:47.599884 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32475a8d_44fa_4fd8_9b4a_e907db6bcd4e.slice/crio-db233e2377a0fcb10ea1fe67a85237160610c9d766e0451af78278500694291a WatchSource:0}: Error finding container db233e2377a0fcb10ea1fe67a85237160610c9d766e0451af78278500694291a: Status 404 returned error can't find the container with id db233e2377a0fcb10ea1fe67a85237160610c9d766e0451af78278500694291a Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.713439 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.956375 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" event={"ID":"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e","Type":"ContainerStarted","Data":"db233e2377a0fcb10ea1fe67a85237160610c9d766e0451af78278500694291a"} Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.958193 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p8r8k" event={"ID":"645d79e0-7c57-4dee-8065-beffdff79fa2","Type":"ContainerStarted","Data":"56dd902c5e3bbc09dceb649253b276a0ad13fd929ebb680d47ec50d90cc61dcd"} Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.959791 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548dc59476-42nnj" event={"ID":"1f763743-6d60-4726-aa92-05e1e720e035","Type":"ContainerStarted","Data":"76206210b7ae086e78dd3f7ac62b4b14196bcf5773e40f5201e81322f508f505"} Dec 15 14:07:47 crc kubenswrapper[4794]: I1215 14:07:47.960547 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" event={"ID":"1ae0e754-4486-4c14-b87d-df2cfc6a94dd","Type":"ContainerStarted","Data":"b65713aa43aa677d3378d3104f1878a4df5d617d71b0bead89bb0d7c9e360527"} Dec 15 14:07:48 crc kubenswrapper[4794]: I1215 14:07:48.125154 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr"] Dec 15 14:07:48 crc kubenswrapper[4794]: W1215 14:07:48.190354 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9245ea6_901c_4e5a_a3be_d7184ace8e8c.slice/crio-55b62091b1971b83364c1393c9b9b9c4bbb4c17b4b671a59e4f50c5fd7ed4fa1 WatchSource:0}: Error finding container 55b62091b1971b83364c1393c9b9b9c4bbb4c17b4b671a59e4f50c5fd7ed4fa1: Status 404 returned 
error can't find the container with id 55b62091b1971b83364c1393c9b9b9c4bbb4c17b4b671a59e4f50c5fd7ed4fa1 Dec 15 14:07:48 crc kubenswrapper[4794]: I1215 14:07:48.971109 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" event={"ID":"b9245ea6-901c-4e5a-a3be-d7184ace8e8c","Type":"ContainerStarted","Data":"55b62091b1971b83364c1393c9b9b9c4bbb4c17b4b671a59e4f50c5fd7ed4fa1"} Dec 15 14:07:51 crc kubenswrapper[4794]: I1215 14:07:51.003115 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548dc59476-42nnj" event={"ID":"1f763743-6d60-4726-aa92-05e1e720e035","Type":"ContainerStarted","Data":"d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13"} Dec 15 14:07:51 crc kubenswrapper[4794]: I1215 14:07:51.027677 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-548dc59476-42nnj" podStartSLOduration=5.02765655 podStartE2EDuration="5.02765655s" podCreationTimestamp="2025-12-15 14:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:07:51.022908435 +0000 UTC m=+832.874930883" watchObservedRunningTime="2025-12-15 14:07:51.02765655 +0000 UTC m=+832.879678998" Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.019231 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p8r8k" event={"ID":"645d79e0-7c57-4dee-8065-beffdff79fa2","Type":"ContainerStarted","Data":"c1594bfbdcebe814280a2579fc46aa23cd22c6f7f86f64f4748e7e739df17da8"} Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.019980 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.024476 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" 
event={"ID":"1ae0e754-4486-4c14-b87d-df2cfc6a94dd","Type":"ContainerStarted","Data":"8ac88970a66c5bebaa7bb10292e4bf08943c17f8c48d408909dad0f40a946f1b"} Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.026921 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" event={"ID":"b9245ea6-901c-4e5a-a3be-d7184ace8e8c","Type":"ContainerStarted","Data":"2caf077a775e87f2bf766ba1cef742bc8d12c95802d682084d4a104aa7769581"} Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.030266 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" event={"ID":"32475a8d-44fa-4fd8-9b4a-e907db6bcd4e","Type":"ContainerStarted","Data":"1f96407851a8fd19206740c5701954c116589ea2c7635007720849a168f1d7f1"} Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.030501 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.049187 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-p8r8k" podStartSLOduration=1.533149512 podStartE2EDuration="7.04916695s" podCreationTimestamp="2025-12-15 14:07:46 +0000 UTC" firstStartedPulling="2025-12-15 14:07:47.094640227 +0000 UTC m=+828.946662665" lastFinishedPulling="2025-12-15 14:07:52.610657645 +0000 UTC m=+834.462680103" observedRunningTime="2025-12-15 14:07:53.04348701 +0000 UTC m=+834.895509458" watchObservedRunningTime="2025-12-15 14:07:53.04916695 +0000 UTC m=+834.901189398" Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.064336 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" podStartSLOduration=2.083717444 podStartE2EDuration="7.064307568s" podCreationTimestamp="2025-12-15 14:07:46 +0000 UTC" firstStartedPulling="2025-12-15 14:07:47.60242416 +0000 UTC 
m=+829.454446598" lastFinishedPulling="2025-12-15 14:07:52.583014284 +0000 UTC m=+834.435036722" observedRunningTime="2025-12-15 14:07:53.06117414 +0000 UTC m=+834.913196618" watchObservedRunningTime="2025-12-15 14:07:53.064307568 +0000 UTC m=+834.916330066" Dec 15 14:07:53 crc kubenswrapper[4794]: I1215 14:07:53.092477 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-m6bgr" podStartSLOduration=2.701757984 podStartE2EDuration="7.092445594s" podCreationTimestamp="2025-12-15 14:07:46 +0000 UTC" firstStartedPulling="2025-12-15 14:07:48.192474118 +0000 UTC m=+830.044496596" lastFinishedPulling="2025-12-15 14:07:52.583161728 +0000 UTC m=+834.435184206" observedRunningTime="2025-12-15 14:07:53.083848401 +0000 UTC m=+834.935870849" watchObservedRunningTime="2025-12-15 14:07:53.092445594 +0000 UTC m=+834.944468072" Dec 15 14:07:57 crc kubenswrapper[4794]: I1215 14:07:57.094537 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-p8r8k" Dec 15 14:07:57 crc kubenswrapper[4794]: I1215 14:07:57.401543 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:57 crc kubenswrapper[4794]: I1215 14:07:57.401824 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:57 crc kubenswrapper[4794]: I1215 14:07:57.409245 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:58 crc kubenswrapper[4794]: I1215 14:07:58.078521 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-548dc59476-42nnj" Dec 15 14:07:58 crc kubenswrapper[4794]: I1215 14:07:58.161502 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rfx4t"] Dec 15 
14:08:03 crc kubenswrapper[4794]: I1215 14:08:03.114165 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" event={"ID":"1ae0e754-4486-4c14-b87d-df2cfc6a94dd","Type":"ContainerStarted","Data":"8d77b520ff2c5661af5eb375503754579313cd6e5444911f7d9bbef7358b33cf"} Dec 15 14:08:07 crc kubenswrapper[4794]: I1215 14:08:07.019030 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cblrd" Dec 15 14:08:07 crc kubenswrapper[4794]: I1215 14:08:07.044136 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-q2rhd" podStartSLOduration=6.199386125 podStartE2EDuration="21.044111102s" podCreationTimestamp="2025-12-15 14:07:46 +0000 UTC" firstStartedPulling="2025-12-15 14:07:47.363242509 +0000 UTC m=+829.215264947" lastFinishedPulling="2025-12-15 14:08:02.207967496 +0000 UTC m=+844.059989924" observedRunningTime="2025-12-15 14:08:03.139743095 +0000 UTC m=+844.991765553" watchObservedRunningTime="2025-12-15 14:08:07.044111102 +0000 UTC m=+848.896133570" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.344547 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp"] Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.346415 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.348146 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.352896 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp"] Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.395352 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz22d\" (UniqueName: \"kubernetes.io/projected/9913031c-bd75-4a6b-b917-338f5d8afbe4-kube-api-access-wz22d\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.395418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.395481 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: 
I1215 14:08:20.496111 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.496163 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz22d\" (UniqueName: \"kubernetes.io/projected/9913031c-bd75-4a6b-b917-338f5d8afbe4-kube-api-access-wz22d\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.496201 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.496667 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.496742 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.517112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz22d\" (UniqueName: \"kubernetes.io/projected/9913031c-bd75-4a6b-b917-338f5d8afbe4-kube-api-access-wz22d\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.662443 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:20 crc kubenswrapper[4794]: I1215 14:08:20.906998 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp"] Dec 15 14:08:20 crc kubenswrapper[4794]: W1215 14:08:20.913893 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9913031c_bd75_4a6b_b917_338f5d8afbe4.slice/crio-8da3351061e9520f2a3b7e4a96198e5471854f06b5f6407f2e31e17c90253f7e WatchSource:0}: Error finding container 8da3351061e9520f2a3b7e4a96198e5471854f06b5f6407f2e31e17c90253f7e: Status 404 returned error can't find the container with id 8da3351061e9520f2a3b7e4a96198e5471854f06b5f6407f2e31e17c90253f7e Dec 15 14:08:21 crc kubenswrapper[4794]: I1215 14:08:21.242756 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" 
event={"ID":"9913031c-bd75-4a6b-b917-338f5d8afbe4","Type":"ContainerStarted","Data":"8da3351061e9520f2a3b7e4a96198e5471854f06b5f6407f2e31e17c90253f7e"} Dec 15 14:08:22 crc kubenswrapper[4794]: I1215 14:08:22.249851 4794 generic.go:334] "Generic (PLEG): container finished" podID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerID="9e028436994a01487f40955dc6e8e2f0571e1627848bc0233a3ea7467a46038e" exitCode=0 Dec 15 14:08:22 crc kubenswrapper[4794]: I1215 14:08:22.249971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" event={"ID":"9913031c-bd75-4a6b-b917-338f5d8afbe4","Type":"ContainerDied","Data":"9e028436994a01487f40955dc6e8e2f0571e1627848bc0233a3ea7467a46038e"} Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.245420 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rfx4t" podUID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" containerName="console" containerID="cri-o://575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb" gracePeriod=15 Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.592302 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rfx4t_aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33/console/0.log" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.592674 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.741506 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-service-ca\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742142 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742198 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-oauth-serving-cert\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742251 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-config\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742271 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr9z8\" (UniqueName: \"kubernetes.io/projected/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-kube-api-access-vr9z8\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742308 
4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-oauth-config\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742330 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-trusted-ca-bundle\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742382 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-serving-cert\") pod \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\" (UID: \"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33\") " Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.742594 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-service-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.743128 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.743449 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-config" (OuterVolumeSpecName: "console-config") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.743937 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.747383 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.748599 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.748673 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-kube-api-access-vr9z8" (OuterVolumeSpecName: "kube-api-access-vr9z8") pod "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" (UID: "aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33"). InnerVolumeSpecName "kube-api-access-vr9z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.843322 4794 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-config\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.843360 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr9z8\" (UniqueName: \"kubernetes.io/projected/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-kube-api-access-vr9z8\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.843375 4794 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.843387 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.843399 4794 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:23 crc kubenswrapper[4794]: I1215 14:08:23.843410 4794 reconciler_common.go:293] "Volume 
detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.269439 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rfx4t_aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33/console/0.log" Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.269528 4794 generic.go:334] "Generic (PLEG): container finished" podID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" containerID="575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb" exitCode=2 Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.269652 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rfx4t" event={"ID":"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33","Type":"ContainerDied","Data":"575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb"} Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.269707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rfx4t" event={"ID":"aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33","Type":"ContainerDied","Data":"f7308c011bdd4f72c35f1891c28eb53cd6df20e3968172b5a6f965cb399d856b"} Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.269730 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rfx4t" Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.269745 4794 scope.go:117] "RemoveContainer" containerID="575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb" Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.295976 4794 scope.go:117] "RemoveContainer" containerID="575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb" Dec 15 14:08:24 crc kubenswrapper[4794]: E1215 14:08:24.296435 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb\": container with ID starting with 575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb not found: ID does not exist" containerID="575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb" Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.296527 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb"} err="failed to get container status \"575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb\": rpc error: code = NotFound desc = could not find container \"575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb\": container with ID starting with 575d667d8be86d6bebe3517ddef1a9e40cae9c6818541a40aac8e27de72bc9eb not found: ID does not exist" Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.321279 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rfx4t"] Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.329323 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rfx4t"] Dec 15 14:08:24 crc kubenswrapper[4794]: I1215 14:08:24.749845 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" 
path="/var/lib/kubelet/pods/aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33/volumes" Dec 15 14:08:26 crc kubenswrapper[4794]: I1215 14:08:26.296611 4794 generic.go:334] "Generic (PLEG): container finished" podID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerID="d52b026a9cf96364baf782a3f3ad02842f51d58defef44a2a63e6cc7724c9421" exitCode=0 Dec 15 14:08:26 crc kubenswrapper[4794]: I1215 14:08:26.296665 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" event={"ID":"9913031c-bd75-4a6b-b917-338f5d8afbe4","Type":"ContainerDied","Data":"d52b026a9cf96364baf782a3f3ad02842f51d58defef44a2a63e6cc7724c9421"} Dec 15 14:08:27 crc kubenswrapper[4794]: I1215 14:08:27.306575 4794 generic.go:334] "Generic (PLEG): container finished" podID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerID="1bf199500189b64a007b3a40a298341e185b8acb45759598acb7f4139b5ee0ba" exitCode=0 Dec 15 14:08:27 crc kubenswrapper[4794]: I1215 14:08:27.306719 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" event={"ID":"9913031c-bd75-4a6b-b917-338f5d8afbe4","Type":"ContainerDied","Data":"1bf199500189b64a007b3a40a298341e185b8acb45759598acb7f4139b5ee0ba"} Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.588658 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.704028 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz22d\" (UniqueName: \"kubernetes.io/projected/9913031c-bd75-4a6b-b917-338f5d8afbe4-kube-api-access-wz22d\") pod \"9913031c-bd75-4a6b-b917-338f5d8afbe4\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.704369 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-bundle\") pod \"9913031c-bd75-4a6b-b917-338f5d8afbe4\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.704643 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-util\") pod \"9913031c-bd75-4a6b-b917-338f5d8afbe4\" (UID: \"9913031c-bd75-4a6b-b917-338f5d8afbe4\") " Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.706656 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-bundle" (OuterVolumeSpecName: "bundle") pod "9913031c-bd75-4a6b-b917-338f5d8afbe4" (UID: "9913031c-bd75-4a6b-b917-338f5d8afbe4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.713062 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9913031c-bd75-4a6b-b917-338f5d8afbe4-kube-api-access-wz22d" (OuterVolumeSpecName: "kube-api-access-wz22d") pod "9913031c-bd75-4a6b-b917-338f5d8afbe4" (UID: "9913031c-bd75-4a6b-b917-338f5d8afbe4"). InnerVolumeSpecName "kube-api-access-wz22d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.724506 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-util" (OuterVolumeSpecName: "util") pod "9913031c-bd75-4a6b-b917-338f5d8afbe4" (UID: "9913031c-bd75-4a6b-b917-338f5d8afbe4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.806128 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz22d\" (UniqueName: \"kubernetes.io/projected/9913031c-bd75-4a6b-b917-338f5d8afbe4-kube-api-access-wz22d\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.806390 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:28 crc kubenswrapper[4794]: I1215 14:08:28.806469 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9913031c-bd75-4a6b-b917-338f5d8afbe4-util\") on node \"crc\" DevicePath \"\"" Dec 15 14:08:29 crc kubenswrapper[4794]: I1215 14:08:29.322223 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" event={"ID":"9913031c-bd75-4a6b-b917-338f5d8afbe4","Type":"ContainerDied","Data":"8da3351061e9520f2a3b7e4a96198e5471854f06b5f6407f2e31e17c90253f7e"} Dec 15 14:08:29 crc kubenswrapper[4794]: I1215 14:08:29.322267 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da3351061e9520f2a3b7e4a96198e5471854f06b5f6407f2e31e17c90253f7e" Dec 15 14:08:29 crc kubenswrapper[4794]: I1215 14:08:29.322382 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836033 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9"] Dec 15 14:08:38 crc kubenswrapper[4794]: E1215 14:08:38.836821 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerName="pull" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836835 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerName="pull" Dec 15 14:08:38 crc kubenswrapper[4794]: E1215 14:08:38.836843 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerName="util" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836850 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerName="util" Dec 15 14:08:38 crc kubenswrapper[4794]: E1215 14:08:38.836860 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" containerName="console" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836866 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" containerName="console" Dec 15 14:08:38 crc kubenswrapper[4794]: E1215 14:08:38.836876 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerName="extract" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836881 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" containerName="extract" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836988 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9913031c-bd75-4a6b-b917-338f5d8afbe4" 
containerName="extract" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.836999 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa7b6fc7-0eb1-44b4-a69b-d8826bc00e33" containerName="console" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.837469 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.843041 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.843100 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.843054 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.843274 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-58blv" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.843818 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.865506 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9"] Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.939198 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66f313bf-1362-4b0b-b516-9e4be299fb48-apiservice-cert\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 
15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.939260 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5jh\" (UniqueName: \"kubernetes.io/projected/66f313bf-1362-4b0b-b516-9e4be299fb48-kube-api-access-4f5jh\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:38 crc kubenswrapper[4794]: I1215 14:08:38.939287 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66f313bf-1362-4b0b-b516-9e4be299fb48-webhook-cert\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.040237 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66f313bf-1362-4b0b-b516-9e4be299fb48-apiservice-cert\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.040299 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5jh\" (UniqueName: \"kubernetes.io/projected/66f313bf-1362-4b0b-b516-9e4be299fb48-kube-api-access-4f5jh\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.040325 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/66f313bf-1362-4b0b-b516-9e4be299fb48-webhook-cert\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.054510 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66f313bf-1362-4b0b-b516-9e4be299fb48-webhook-cert\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.054533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66f313bf-1362-4b0b-b516-9e4be299fb48-apiservice-cert\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.067344 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5jh\" (UniqueName: \"kubernetes.io/projected/66f313bf-1362-4b0b-b516-9e4be299fb48-kube-api-access-4f5jh\") pod \"metallb-operator-controller-manager-68bfc664d8-gmqz9\" (UID: \"66f313bf-1362-4b0b-b516-9e4be299fb48\") " pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.093765 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc"] Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.094420 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.096299 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.097331 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.097774 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nknm4" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.122031 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc"] Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.141777 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b7d789c-382b-41c7-af8b-78625cab6ea7-apiservice-cert\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.141818 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b7d789c-382b-41c7-af8b-78625cab6ea7-webhook-cert\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.141842 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m4kq\" (UniqueName: 
\"kubernetes.io/projected/3b7d789c-382b-41c7-af8b-78625cab6ea7-kube-api-access-5m4kq\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.159846 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.242966 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b7d789c-382b-41c7-af8b-78625cab6ea7-apiservice-cert\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.243355 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b7d789c-382b-41c7-af8b-78625cab6ea7-webhook-cert\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.243386 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m4kq\" (UniqueName: \"kubernetes.io/projected/3b7d789c-382b-41c7-af8b-78625cab6ea7-kube-api-access-5m4kq\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.249746 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/3b7d789c-382b-41c7-af8b-78625cab6ea7-webhook-cert\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.255136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b7d789c-382b-41c7-af8b-78625cab6ea7-apiservice-cert\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.281232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m4kq\" (UniqueName: \"kubernetes.io/projected/3b7d789c-382b-41c7-af8b-78625cab6ea7-kube-api-access-5m4kq\") pod \"metallb-operator-webhook-server-9d7768595-tdwxc\" (UID: \"3b7d789c-382b-41c7-af8b-78625cab6ea7\") " pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.416707 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.433181 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9"] Dec 15 14:08:39 crc kubenswrapper[4794]: I1215 14:08:39.677615 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc"] Dec 15 14:08:39 crc kubenswrapper[4794]: W1215 14:08:39.684387 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b7d789c_382b_41c7_af8b_78625cab6ea7.slice/crio-8a6ee0ca95f8d6c467be890bd5ade6f566a8db20eb3dfb041fe290a9242c9b3f WatchSource:0}: Error finding container 8a6ee0ca95f8d6c467be890bd5ade6f566a8db20eb3dfb041fe290a9242c9b3f: Status 404 returned error can't find the container with id 8a6ee0ca95f8d6c467be890bd5ade6f566a8db20eb3dfb041fe290a9242c9b3f Dec 15 14:08:40 crc kubenswrapper[4794]: I1215 14:08:40.393848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" event={"ID":"66f313bf-1362-4b0b-b516-9e4be299fb48","Type":"ContainerStarted","Data":"c00eedb2299fe5b923c1fa54ec1cf4d779d8b0033871a6421246c48a424c2dbd"} Dec 15 14:08:40 crc kubenswrapper[4794]: I1215 14:08:40.395367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" event={"ID":"3b7d789c-382b-41c7-af8b-78625cab6ea7","Type":"ContainerStarted","Data":"8a6ee0ca95f8d6c467be890bd5ade6f566a8db20eb3dfb041fe290a9242c9b3f"} Dec 15 14:08:43 crc kubenswrapper[4794]: I1215 14:08:43.419251 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" 
event={"ID":"66f313bf-1362-4b0b-b516-9e4be299fb48","Type":"ContainerStarted","Data":"79e5f1a309a2f049b6f234590507bf798bbb753d0d009f02a88187d602a6f1c5"} Dec 15 14:08:43 crc kubenswrapper[4794]: I1215 14:08:43.419721 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:08:43 crc kubenswrapper[4794]: I1215 14:08:43.443628 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" podStartSLOduration=2.428503393 podStartE2EDuration="5.443612812s" podCreationTimestamp="2025-12-15 14:08:38 +0000 UTC" firstStartedPulling="2025-12-15 14:08:39.449737331 +0000 UTC m=+881.301759769" lastFinishedPulling="2025-12-15 14:08:42.464846739 +0000 UTC m=+884.316869188" observedRunningTime="2025-12-15 14:08:43.440363779 +0000 UTC m=+885.292386227" watchObservedRunningTime="2025-12-15 14:08:43.443612812 +0000 UTC m=+885.295635250" Dec 15 14:08:45 crc kubenswrapper[4794]: I1215 14:08:45.440134 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" event={"ID":"3b7d789c-382b-41c7-af8b-78625cab6ea7","Type":"ContainerStarted","Data":"5a4170ad426a77f397075600e28101f6abc99bcf8c1a8b83b1f1d65ac686853a"} Dec 15 14:08:45 crc kubenswrapper[4794]: I1215 14:08:45.440711 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:08:45 crc kubenswrapper[4794]: I1215 14:08:45.470553 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" podStartSLOduration=1.078912854 podStartE2EDuration="6.470535602s" podCreationTimestamp="2025-12-15 14:08:39 +0000 UTC" firstStartedPulling="2025-12-15 14:08:39.689359572 +0000 UTC m=+881.541382010" lastFinishedPulling="2025-12-15 
14:08:45.08098232 +0000 UTC m=+886.933004758" observedRunningTime="2025-12-15 14:08:45.459683734 +0000 UTC m=+887.311706182" watchObservedRunningTime="2025-12-15 14:08:45.470535602 +0000 UTC m=+887.322558040" Dec 15 14:08:59 crc kubenswrapper[4794]: I1215 14:08:59.423273 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-9d7768595-tdwxc" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.162844 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-68bfc664d8-gmqz9" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.834733 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq"] Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.835373 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.839833 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.840830 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gq8mj" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.860620 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6c7j6"] Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.876736 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq"] Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.877305 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.879433 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.880419 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921264 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics-certs\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bxv\" (UniqueName: \"kubernetes.io/projected/ea296da9-d8b4-41a2-834e-119076ca46a8-kube-api-access-w5bxv\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcccw\" (UniqueName: \"kubernetes.io/projected/5975edc0-914a-4df8-824e-c83bfe9e2f49-kube-api-access-xcccw\") pod \"frr-k8s-webhook-server-7784b6fcf-jr8lq\" (UID: \"5975edc0-914a-4df8-824e-c83bfe9e2f49\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921429 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-sockets\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 
14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921457 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-startup\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921476 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-conf\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921512 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5975edc0-914a-4df8-824e-c83bfe9e2f49-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-jr8lq\" (UID: \"5975edc0-914a-4df8-824e-c83bfe9e2f49\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921537 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-reloader\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.921560 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.939641 4794 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/speaker-8t6l8"] Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.940697 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8t6l8" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.943002 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.943213 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.943349 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.948885 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-jf8wk"] Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.948985 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-c7kvg" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.950386 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.957168 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 15 14:09:19 crc kubenswrapper[4794]: I1215 14:09:19.965882 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-jf8wk"] Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.022503 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9bsr\" (UniqueName: \"kubernetes.io/projected/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-kube-api-access-t9bsr\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.022809 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bxv\" (UniqueName: \"kubernetes.io/projected/ea296da9-d8b4-41a2-834e-119076ca46a8-kube-api-access-w5bxv\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024738 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcccw\" (UniqueName: \"kubernetes.io/projected/5975edc0-914a-4df8-824e-c83bfe9e2f49-kube-api-access-xcccw\") pod \"frr-k8s-webhook-server-7784b6fcf-jr8lq\" (UID: \"5975edc0-914a-4df8-824e-c83bfe9e2f49\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024788 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbfd\" (UniqueName: \"kubernetes.io/projected/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-kube-api-access-7lbfd\") pod \"speaker-8t6l8\" (UID: 
\"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024805 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-metrics-certs\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024831 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-sockets\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024861 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-startup\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-conf\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024911 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-metrics-certs\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024939 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024969 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5975edc0-914a-4df8-824e-c83bfe9e2f49-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-jr8lq\" (UID: \"5975edc0-914a-4df8-824e-c83bfe9e2f49\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.024993 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-reloader\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025042 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025074 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-cert\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025109 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-metallb-excludel2\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025134 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics-certs\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.025231 4794 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.025275 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics-certs podName:ea296da9-d8b4-41a2-834e-119076ca46a8 nodeName:}" failed. No retries permitted until 2025-12-15 14:09:20.525258695 +0000 UTC m=+922.377281143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics-certs") pod "frr-k8s-6c7j6" (UID: "ea296da9-d8b4-41a2-834e-119076ca46a8") : secret "frr-k8s-certs-secret" not found Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025289 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-sockets\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025356 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-reloader\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025509 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-conf\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025519 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.025912 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ea296da9-d8b4-41a2-834e-119076ca46a8-frr-startup\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 
14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.033694 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5975edc0-914a-4df8-824e-c83bfe9e2f49-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-jr8lq\" (UID: \"5975edc0-914a-4df8-824e-c83bfe9e2f49\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.041852 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bxv\" (UniqueName: \"kubernetes.io/projected/ea296da9-d8b4-41a2-834e-119076ca46a8-kube-api-access-w5bxv\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.046619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcccw\" (UniqueName: \"kubernetes.io/projected/5975edc0-914a-4df8-824e-c83bfe9e2f49-kube-api-access-xcccw\") pod \"frr-k8s-webhook-server-7784b6fcf-jr8lq\" (UID: \"5975edc0-914a-4df8-824e-c83bfe9e2f49\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.126717 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-metallb-excludel2\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.126807 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9bsr\" (UniqueName: \"kubernetes.io/projected/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-kube-api-access-t9bsr\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 
14:09:20.126868 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbfd\" (UniqueName: \"kubernetes.io/projected/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-kube-api-access-7lbfd\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.126890 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-metrics-certs\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.126959 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-metrics-certs\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.127141 4794 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.127206 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-metrics-certs podName:cb3f9a4c-a711-401e-a7fe-d4cec39be7d9 nodeName:}" failed. No retries permitted until 2025-12-15 14:09:20.627185274 +0000 UTC m=+922.479207712 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-metrics-certs") pod "controller-5bddd4b946-jf8wk" (UID: "cb3f9a4c-a711-401e-a7fe-d4cec39be7d9") : secret "controller-certs-secret" not found Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.127509 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.127554 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-cert\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.127563 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-metallb-excludel2\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.127574 4794 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.127653 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist podName:433f5422-e79c-46f8-bc6d-7b6dbdaf2462 nodeName:}" failed. No retries permitted until 2025-12-15 14:09:20.627640337 +0000 UTC m=+922.479662865 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist") pod "speaker-8t6l8" (UID: "433f5422-e79c-46f8-bc6d-7b6dbdaf2462") : secret "metallb-memberlist" not found Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.128835 4794 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.136167 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-metrics-certs\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.141502 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-cert\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.145745 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9bsr\" (UniqueName: \"kubernetes.io/projected/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-kube-api-access-t9bsr\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.146379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbfd\" (UniqueName: \"kubernetes.io/projected/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-kube-api-access-7lbfd\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.175762 4794 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.532466 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics-certs\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.537323 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea296da9-d8b4-41a2-834e-119076ca46a8-metrics-certs\") pod \"frr-k8s-6c7j6\" (UID: \"ea296da9-d8b4-41a2-834e-119076ca46a8\") " pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.602056 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq"] Dec 15 14:09:20 crc kubenswrapper[4794]: W1215 14:09:20.610246 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5975edc0_914a_4df8_824e_c83bfe9e2f49.slice/crio-d475065a9bc11c7fb0d40afec729aba1b79b4225fddcf74f00a0cdc0437ade3b WatchSource:0}: Error finding container d475065a9bc11c7fb0d40afec729aba1b79b4225fddcf74f00a0cdc0437ade3b: Status 404 returned error can't find the container with id d475065a9bc11c7fb0d40afec729aba1b79b4225fddcf74f00a0cdc0437ade3b Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.633863 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-metrics-certs\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.633948 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.634240 4794 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 15 14:09:20 crc kubenswrapper[4794]: E1215 14:09:20.634337 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist podName:433f5422-e79c-46f8-bc6d-7b6dbdaf2462 nodeName:}" failed. No retries permitted until 2025-12-15 14:09:21.634310118 +0000 UTC m=+923.486332596 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist") pod "speaker-8t6l8" (UID: "433f5422-e79c-46f8-bc6d-7b6dbdaf2462") : secret "metallb-memberlist" not found Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.639387 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb3f9a4c-a711-401e-a7fe-d4cec39be7d9-metrics-certs\") pod \"controller-5bddd4b946-jf8wk\" (UID: \"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9\") " pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.694085 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" event={"ID":"5975edc0-914a-4df8-824e-c83bfe9e2f49","Type":"ContainerStarted","Data":"d475065a9bc11c7fb0d40afec729aba1b79b4225fddcf74f00a0cdc0437ade3b"} Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.812341 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:20 crc kubenswrapper[4794]: I1215 14:09:20.904188 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.198325 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-jf8wk"] Dec 15 14:09:21 crc kubenswrapper[4794]: W1215 14:09:21.201805 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3f9a4c_a711_401e_a7fe_d4cec39be7d9.slice/crio-252bbef3a9a8c1ace21f4809686e934f7d7651a6369e216b8bc22e79c456ef42 WatchSource:0}: Error finding container 252bbef3a9a8c1ace21f4809686e934f7d7651a6369e216b8bc22e79c456ef42: Status 404 returned error can't find the container with id 252bbef3a9a8c1ace21f4809686e934f7d7651a6369e216b8bc22e79c456ef42 Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.656091 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.664709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/433f5422-e79c-46f8-bc6d-7b6dbdaf2462-memberlist\") pod \"speaker-8t6l8\" (UID: \"433f5422-e79c-46f8-bc6d-7b6dbdaf2462\") " pod="metallb-system/speaker-8t6l8" Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.702193 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-jf8wk" event={"ID":"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9","Type":"ContainerStarted","Data":"342ef9662aa6ac7ff5007166d9103ca3cb5a73759fbf39b27bad11d4c955f449"} Dec 15 14:09:21 crc 
kubenswrapper[4794]: I1215 14:09:21.702252 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-jf8wk" event={"ID":"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9","Type":"ContainerStarted","Data":"84b370aadd0e68f88e2ec39b3fcf9baa48760a629dd57f79f1bbd7af73bd644f"} Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.702280 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.702294 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-jf8wk" event={"ID":"cb3f9a4c-a711-401e-a7fe-d4cec39be7d9","Type":"ContainerStarted","Data":"252bbef3a9a8c1ace21f4809686e934f7d7651a6369e216b8bc22e79c456ef42"} Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.703872 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"2854c3e4cc35d971782c27958f81693e3c2a53cfc22203eaad1473b31e5bfe15"} Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.721197 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-jf8wk" podStartSLOduration=2.721172433 podStartE2EDuration="2.721172433s" podCreationTimestamp="2025-12-15 14:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:09:21.717414956 +0000 UTC m=+923.569437404" watchObservedRunningTime="2025-12-15 14:09:21.721172433 +0000 UTC m=+923.573194901" Dec 15 14:09:21 crc kubenswrapper[4794]: I1215 14:09:21.794964 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8t6l8" Dec 15 14:09:21 crc kubenswrapper[4794]: W1215 14:09:21.820229 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433f5422_e79c_46f8_bc6d_7b6dbdaf2462.slice/crio-0342de2050020354ef96b0113afc343ed6d56169117c254f04ab5d3396110c3a WatchSource:0}: Error finding container 0342de2050020354ef96b0113afc343ed6d56169117c254f04ab5d3396110c3a: Status 404 returned error can't find the container with id 0342de2050020354ef96b0113afc343ed6d56169117c254f04ab5d3396110c3a Dec 15 14:09:22 crc kubenswrapper[4794]: I1215 14:09:22.711852 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8t6l8" event={"ID":"433f5422-e79c-46f8-bc6d-7b6dbdaf2462","Type":"ContainerStarted","Data":"f0fe4164f125dd8d11feeb54af9dfd504ddce303f2acfe3353b4c4a8b16ea750"} Dec 15 14:09:22 crc kubenswrapper[4794]: I1215 14:09:22.712197 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8t6l8" event={"ID":"433f5422-e79c-46f8-bc6d-7b6dbdaf2462","Type":"ContainerStarted","Data":"f9bd2fd4d17d7e2990621401e0ed976b4f52df72af0c4a328fbb033fa5af03d7"} Dec 15 14:09:22 crc kubenswrapper[4794]: I1215 14:09:22.712214 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8t6l8" event={"ID":"433f5422-e79c-46f8-bc6d-7b6dbdaf2462","Type":"ContainerStarted","Data":"0342de2050020354ef96b0113afc343ed6d56169117c254f04ab5d3396110c3a"} Dec 15 14:09:22 crc kubenswrapper[4794]: I1215 14:09:22.712576 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8t6l8" Dec 15 14:09:22 crc kubenswrapper[4794]: I1215 14:09:22.737699 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8t6l8" podStartSLOduration=3.737682345 podStartE2EDuration="3.737682345s" podCreationTimestamp="2025-12-15 14:09:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:09:22.733741713 +0000 UTC m=+924.585764161" watchObservedRunningTime="2025-12-15 14:09:22.737682345 +0000 UTC m=+924.589704783" Dec 15 14:09:24 crc kubenswrapper[4794]: I1215 14:09:24.534008 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:09:24 crc kubenswrapper[4794]: I1215 14:09:24.534384 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:09:28 crc kubenswrapper[4794]: I1215 14:09:28.762747 4794 generic.go:334] "Generic (PLEG): container finished" podID="ea296da9-d8b4-41a2-834e-119076ca46a8" containerID="9fac9d68fad508e25af31eaa88b6930cf8dfbad40f1422e2bad1182a28119701" exitCode=0 Dec 15 14:09:28 crc kubenswrapper[4794]: I1215 14:09:28.762817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerDied","Data":"9fac9d68fad508e25af31eaa88b6930cf8dfbad40f1422e2bad1182a28119701"} Dec 15 14:09:28 crc kubenswrapper[4794]: I1215 14:09:28.768031 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" event={"ID":"5975edc0-914a-4df8-824e-c83bfe9e2f49","Type":"ContainerStarted","Data":"db5aaf740ed3c5bc7459dde342f61b1223d2275d609288f9b667ea1540dea248"} Dec 15 14:09:28 crc kubenswrapper[4794]: I1215 14:09:28.768525 4794 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:28 crc kubenswrapper[4794]: I1215 14:09:28.824866 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" podStartSLOduration=2.14665854 podStartE2EDuration="9.824843237s" podCreationTimestamp="2025-12-15 14:09:19 +0000 UTC" firstStartedPulling="2025-12-15 14:09:20.61251305 +0000 UTC m=+922.464535498" lastFinishedPulling="2025-12-15 14:09:28.290697757 +0000 UTC m=+930.142720195" observedRunningTime="2025-12-15 14:09:28.813086603 +0000 UTC m=+930.665109051" watchObservedRunningTime="2025-12-15 14:09:28.824843237 +0000 UTC m=+930.676865675" Dec 15 14:09:29 crc kubenswrapper[4794]: I1215 14:09:29.776666 4794 generic.go:334] "Generic (PLEG): container finished" podID="ea296da9-d8b4-41a2-834e-119076ca46a8" containerID="a63d31e70ca78d89f68074e85b17fc36d59d6d3473aed73894f76e77820b88c2" exitCode=0 Dec 15 14:09:29 crc kubenswrapper[4794]: I1215 14:09:29.776793 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerDied","Data":"a63d31e70ca78d89f68074e85b17fc36d59d6d3473aed73894f76e77820b88c2"} Dec 15 14:09:30 crc kubenswrapper[4794]: I1215 14:09:30.789392 4794 generic.go:334] "Generic (PLEG): container finished" podID="ea296da9-d8b4-41a2-834e-119076ca46a8" containerID="4957e095c3910cb95ed5c08a0c8f365df7f411bbd7607ba0ab4b4f28d7c1b443" exitCode=0 Dec 15 14:09:30 crc kubenswrapper[4794]: I1215 14:09:30.789463 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerDied","Data":"4957e095c3910cb95ed5c08a0c8f365df7f411bbd7607ba0ab4b4f28d7c1b443"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.802301 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"fb4565d63faebb15b771a51f7714d5c7bae2b43a0a1e31bab7fcdf0f0b4ce38f"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.802694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"2d12f3bb97339de1f32dcf30b1545a442144edf966438bbfc30b7c1d99cde0ef"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.802709 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"da08ea973ff7c576ee02f9c808de4d563ccbefaa52f683fd11110d496575339c"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.802721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"4e772fffddbfba2e9bf01e26abcc6deda77e06619912d85e6dcc0a268c87a5fe"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.802732 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"fb18e4910c5eb0f19c5c8f15ce42aed6b80400fabf30651f0734b1b5c3326360"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.802743 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6c7j6" event={"ID":"ea296da9-d8b4-41a2-834e-119076ca46a8","Type":"ContainerStarted","Data":"14069ca45bb4df9de181827ed39cc5ff88bd1a4fc3e3fe261e536554d5e493f2"} Dec 15 14:09:31 crc kubenswrapper[4794]: I1215 14:09:31.827005 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6c7j6" podStartSLOduration=5.523900282 podStartE2EDuration="12.826987958s" podCreationTimestamp="2025-12-15 14:09:19 +0000 
UTC" firstStartedPulling="2025-12-15 14:09:21.002270866 +0000 UTC m=+922.854293304" lastFinishedPulling="2025-12-15 14:09:28.305358542 +0000 UTC m=+930.157380980" observedRunningTime="2025-12-15 14:09:31.823277383 +0000 UTC m=+933.675299911" watchObservedRunningTime="2025-12-15 14:09:31.826987958 +0000 UTC m=+933.679010416" Dec 15 14:09:32 crc kubenswrapper[4794]: I1215 14:09:32.810079 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:35 crc kubenswrapper[4794]: I1215 14:09:35.812763 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:35 crc kubenswrapper[4794]: I1215 14:09:35.852509 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:40 crc kubenswrapper[4794]: I1215 14:09:40.183233 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jr8lq" Dec 15 14:09:40 crc kubenswrapper[4794]: I1215 14:09:40.816194 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6c7j6" Dec 15 14:09:40 crc kubenswrapper[4794]: I1215 14:09:40.916481 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-jf8wk" Dec 15 14:09:41 crc kubenswrapper[4794]: I1215 14:09:41.798996 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8t6l8" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.295967 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn"] Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.298114 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.300161 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.311753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn"] Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.459749 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.459813 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.459832 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljnh9\" (UniqueName: \"kubernetes.io/projected/b00203fa-f8ca-49f8-9a8d-1908f8414ead-kube-api-access-ljnh9\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: 
I1215 14:09:43.561509 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.561608 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.561636 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljnh9\" (UniqueName: \"kubernetes.io/projected/b00203fa-f8ca-49f8-9a8d-1908f8414ead-kube-api-access-ljnh9\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.562113 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.562152 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.579676 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljnh9\" (UniqueName: \"kubernetes.io/projected/b00203fa-f8ca-49f8-9a8d-1908f8414ead-kube-api-access-ljnh9\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.614871 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.840408 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn"] Dec 15 14:09:43 crc kubenswrapper[4794]: W1215 14:09:43.844209 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00203fa_f8ca_49f8_9a8d_1908f8414ead.slice/crio-a20699ee648acee0bdbb24cd2e08a22156ea52c91ce850f8670be2ac85bcf272 WatchSource:0}: Error finding container a20699ee648acee0bdbb24cd2e08a22156ea52c91ce850f8670be2ac85bcf272: Status 404 returned error can't find the container with id a20699ee648acee0bdbb24cd2e08a22156ea52c91ce850f8670be2ac85bcf272 Dec 15 14:09:43 crc kubenswrapper[4794]: I1215 14:09:43.890068 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" 
event={"ID":"b00203fa-f8ca-49f8-9a8d-1908f8414ead","Type":"ContainerStarted","Data":"a20699ee648acee0bdbb24cd2e08a22156ea52c91ce850f8670be2ac85bcf272"} Dec 15 14:09:44 crc kubenswrapper[4794]: I1215 14:09:44.899683 4794 generic.go:334] "Generic (PLEG): container finished" podID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerID="712e54df5c4ac5510f05ec6c6c5450bd0719600898cc0d6078574bf765001742" exitCode=0 Dec 15 14:09:44 crc kubenswrapper[4794]: I1215 14:09:44.899788 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" event={"ID":"b00203fa-f8ca-49f8-9a8d-1908f8414ead","Type":"ContainerDied","Data":"712e54df5c4ac5510f05ec6c6c5450bd0719600898cc0d6078574bf765001742"} Dec 15 14:09:48 crc kubenswrapper[4794]: I1215 14:09:48.930219 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" event={"ID":"b00203fa-f8ca-49f8-9a8d-1908f8414ead","Type":"ContainerStarted","Data":"64db5e913c69b4f49327516b509bdef8fb9fb6cac104ac8c2f748b36e5f2d5a0"} Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.575910 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lqcfg"] Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.577524 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.600050 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqcfg"] Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.641634 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-catalog-content\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.641698 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-utilities\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.641786 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8hc\" (UniqueName: \"kubernetes.io/projected/80a337e8-5e48-4eb7-a448-be437e44390b-kube-api-access-kr8hc\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.742475 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-catalog-content\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.742525 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-utilities\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.742663 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8hc\" (UniqueName: \"kubernetes.io/projected/80a337e8-5e48-4eb7-a448-be437e44390b-kube-api-access-kr8hc\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.743005 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-catalog-content\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.743247 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-utilities\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.764468 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8hc\" (UniqueName: \"kubernetes.io/projected/80a337e8-5e48-4eb7-a448-be437e44390b-kube-api-access-kr8hc\") pod \"community-operators-lqcfg\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.895783 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.938505 4794 generic.go:334] "Generic (PLEG): container finished" podID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerID="64db5e913c69b4f49327516b509bdef8fb9fb6cac104ac8c2f748b36e5f2d5a0" exitCode=0 Dec 15 14:09:49 crc kubenswrapper[4794]: I1215 14:09:49.938567 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" event={"ID":"b00203fa-f8ca-49f8-9a8d-1908f8414ead","Type":"ContainerDied","Data":"64db5e913c69b4f49327516b509bdef8fb9fb6cac104ac8c2f748b36e5f2d5a0"} Dec 15 14:09:50 crc kubenswrapper[4794]: I1215 14:09:50.441693 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqcfg"] Dec 15 14:09:50 crc kubenswrapper[4794]: W1215 14:09:50.455093 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80a337e8_5e48_4eb7_a448_be437e44390b.slice/crio-a69b9b5aa8a31b3aed3b34f3f02106765481e1de60b777f9426c403dd7c99576 WatchSource:0}: Error finding container a69b9b5aa8a31b3aed3b34f3f02106765481e1de60b777f9426c403dd7c99576: Status 404 returned error can't find the container with id a69b9b5aa8a31b3aed3b34f3f02106765481e1de60b777f9426c403dd7c99576 Dec 15 14:09:50 crc kubenswrapper[4794]: I1215 14:09:50.947222 4794 generic.go:334] "Generic (PLEG): container finished" podID="80a337e8-5e48-4eb7-a448-be437e44390b" containerID="edc00ce7869404b0c83b44410f7859a33d7dd66bdb700fe2366b27a75bda2229" exitCode=0 Dec 15 14:09:50 crc kubenswrapper[4794]: I1215 14:09:50.947273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqcfg" event={"ID":"80a337e8-5e48-4eb7-a448-be437e44390b","Type":"ContainerDied","Data":"edc00ce7869404b0c83b44410f7859a33d7dd66bdb700fe2366b27a75bda2229"} Dec 15 14:09:50 
crc kubenswrapper[4794]: I1215 14:09:50.947707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqcfg" event={"ID":"80a337e8-5e48-4eb7-a448-be437e44390b","Type":"ContainerStarted","Data":"a69b9b5aa8a31b3aed3b34f3f02106765481e1de60b777f9426c403dd7c99576"} Dec 15 14:09:50 crc kubenswrapper[4794]: I1215 14:09:50.958632 4794 generic.go:334] "Generic (PLEG): container finished" podID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerID="9adb7fa2df1be77228f9b801a8d4281a7098786c460c9b6e2722239772a79e99" exitCode=0 Dec 15 14:09:50 crc kubenswrapper[4794]: I1215 14:09:50.958759 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" event={"ID":"b00203fa-f8ca-49f8-9a8d-1908f8414ead","Type":"ContainerDied","Data":"9adb7fa2df1be77228f9b801a8d4281a7098786c460c9b6e2722239772a79e99"} Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.213281 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.388223 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-bundle\") pod \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.388340 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-util\") pod \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.388432 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljnh9\" (UniqueName: \"kubernetes.io/projected/b00203fa-f8ca-49f8-9a8d-1908f8414ead-kube-api-access-ljnh9\") pod \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\" (UID: \"b00203fa-f8ca-49f8-9a8d-1908f8414ead\") " Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.390177 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-bundle" (OuterVolumeSpecName: "bundle") pod "b00203fa-f8ca-49f8-9a8d-1908f8414ead" (UID: "b00203fa-f8ca-49f8-9a8d-1908f8414ead"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.394000 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00203fa-f8ca-49f8-9a8d-1908f8414ead-kube-api-access-ljnh9" (OuterVolumeSpecName: "kube-api-access-ljnh9") pod "b00203fa-f8ca-49f8-9a8d-1908f8414ead" (UID: "b00203fa-f8ca-49f8-9a8d-1908f8414ead"). InnerVolumeSpecName "kube-api-access-ljnh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.402165 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-util" (OuterVolumeSpecName: "util") pod "b00203fa-f8ca-49f8-9a8d-1908f8414ead" (UID: "b00203fa-f8ca-49f8-9a8d-1908f8414ead"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.490271 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-util\") on node \"crc\" DevicePath \"\"" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.490312 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljnh9\" (UniqueName: \"kubernetes.io/projected/b00203fa-f8ca-49f8-9a8d-1908f8414ead-kube-api-access-ljnh9\") on node \"crc\" DevicePath \"\"" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.490325 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00203fa-f8ca-49f8-9a8d-1908f8414ead-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.972563 4794 generic.go:334] "Generic (PLEG): container finished" podID="80a337e8-5e48-4eb7-a448-be437e44390b" containerID="3e762d637079729f1b0ea0502fa67de7862b2860446ee7cad88ea75b40f171d8" exitCode=0 Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.972683 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqcfg" event={"ID":"80a337e8-5e48-4eb7-a448-be437e44390b","Type":"ContainerDied","Data":"3e762d637079729f1b0ea0502fa67de7862b2860446ee7cad88ea75b40f171d8"} Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.975121 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" event={"ID":"b00203fa-f8ca-49f8-9a8d-1908f8414ead","Type":"ContainerDied","Data":"a20699ee648acee0bdbb24cd2e08a22156ea52c91ce850f8670be2ac85bcf272"} Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.975153 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn" Dec 15 14:09:52 crc kubenswrapper[4794]: I1215 14:09:52.975162 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20699ee648acee0bdbb24cd2e08a22156ea52c91ce850f8670be2ac85bcf272" Dec 15 14:09:53 crc kubenswrapper[4794]: I1215 14:09:53.983138 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqcfg" event={"ID":"80a337e8-5e48-4eb7-a448-be437e44390b","Type":"ContainerStarted","Data":"ee12e2a0a0698258bd335ccf558ee1fbfaf42ff34afb315744e41e37424104b1"} Dec 15 14:09:54 crc kubenswrapper[4794]: I1215 14:09:54.076937 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lqcfg" podStartSLOduration=2.596326302 podStartE2EDuration="5.07691877s" podCreationTimestamp="2025-12-15 14:09:49 +0000 UTC" firstStartedPulling="2025-12-15 14:09:50.950661151 +0000 UTC m=+952.802683589" lastFinishedPulling="2025-12-15 14:09:53.431253619 +0000 UTC m=+955.283276057" observedRunningTime="2025-12-15 14:09:54.072775612 +0000 UTC m=+955.924798060" watchObservedRunningTime="2025-12-15 14:09:54.07691877 +0000 UTC m=+955.928941208" Dec 15 14:09:54 crc kubenswrapper[4794]: I1215 14:09:54.534231 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 15 14:09:54 crc kubenswrapper[4794]: I1215 14:09:54.534289 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.944037 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps"] Dec 15 14:09:56 crc kubenswrapper[4794]: E1215 14:09:56.944641 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="pull" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.944657 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="pull" Dec 15 14:09:56 crc kubenswrapper[4794]: E1215 14:09:56.944671 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="extract" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.944679 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="extract" Dec 15 14:09:56 crc kubenswrapper[4794]: E1215 14:09:56.944698 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="util" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.944706 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="util" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.944818 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00203fa-f8ca-49f8-9a8d-1908f8414ead" containerName="extract" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.945248 
4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.949080 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-hdc5f" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.949419 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.949634 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.958497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-446tw\" (UniqueName: \"kubernetes.io/projected/91c7760a-6182-4ad2-b4a6-62a09fa479c6-kube-api-access-446tw\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27pps\" (UID: \"91c7760a-6182-4ad2-b4a6-62a09fa479c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.958550 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c7760a-6182-4ad2-b4a6-62a09fa479c6-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27pps\" (UID: \"91c7760a-6182-4ad2-b4a6-62a09fa479c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:56 crc kubenswrapper[4794]: I1215 14:09:56.965954 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps"] Dec 15 14:09:57 crc kubenswrapper[4794]: I1215 14:09:57.060103 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-446tw\" (UniqueName: \"kubernetes.io/projected/91c7760a-6182-4ad2-b4a6-62a09fa479c6-kube-api-access-446tw\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27pps\" (UID: \"91c7760a-6182-4ad2-b4a6-62a09fa479c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:57 crc kubenswrapper[4794]: I1215 14:09:57.060161 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c7760a-6182-4ad2-b4a6-62a09fa479c6-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27pps\" (UID: \"91c7760a-6182-4ad2-b4a6-62a09fa479c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:57 crc kubenswrapper[4794]: I1215 14:09:57.060714 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91c7760a-6182-4ad2-b4a6-62a09fa479c6-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27pps\" (UID: \"91c7760a-6182-4ad2-b4a6-62a09fa479c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:57 crc kubenswrapper[4794]: I1215 14:09:57.093942 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-446tw\" (UniqueName: \"kubernetes.io/projected/91c7760a-6182-4ad2-b4a6-62a09fa479c6-kube-api-access-446tw\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27pps\" (UID: \"91c7760a-6182-4ad2-b4a6-62a09fa479c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:57 crc kubenswrapper[4794]: I1215 14:09:57.260834 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" Dec 15 14:09:57 crc kubenswrapper[4794]: I1215 14:09:57.457337 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps"] Dec 15 14:09:57 crc kubenswrapper[4794]: W1215 14:09:57.469805 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c7760a_6182_4ad2_b4a6_62a09fa479c6.slice/crio-e2372a24f43180cbc228e35a9b293723eb2ac4ef22056c3ef30b91c21e9f1205 WatchSource:0}: Error finding container e2372a24f43180cbc228e35a9b293723eb2ac4ef22056c3ef30b91c21e9f1205: Status 404 returned error can't find the container with id e2372a24f43180cbc228e35a9b293723eb2ac4ef22056c3ef30b91c21e9f1205 Dec 15 14:09:58 crc kubenswrapper[4794]: I1215 14:09:58.008131 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" event={"ID":"91c7760a-6182-4ad2-b4a6-62a09fa479c6","Type":"ContainerStarted","Data":"e2372a24f43180cbc228e35a9b293723eb2ac4ef22056c3ef30b91c21e9f1205"} Dec 15 14:09:59 crc kubenswrapper[4794]: I1215 14:09:59.896984 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:59 crc kubenswrapper[4794]: I1215 14:09:59.897309 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:09:59 crc kubenswrapper[4794]: I1215 14:09:59.950119 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:10:00 crc kubenswrapper[4794]: I1215 14:10:00.084911 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:10:02 crc 
kubenswrapper[4794]: I1215 14:10:02.583994 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfjvp"] Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.585527 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.605191 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfjvp"] Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.634180 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-utilities\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.634247 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-catalog-content\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.634326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp585\" (UniqueName: \"kubernetes.io/projected/1adc964b-5e7f-4685-ae6f-325531ddb5c8-kube-api-access-sp585\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.735780 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-utilities\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.735851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-catalog-content\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.735911 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp585\" (UniqueName: \"kubernetes.io/projected/1adc964b-5e7f-4685-ae6f-325531ddb5c8-kube-api-access-sp585\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.736485 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-catalog-content\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.736475 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-utilities\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.766238 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp585\" (UniqueName: 
\"kubernetes.io/projected/1adc964b-5e7f-4685-ae6f-325531ddb5c8-kube-api-access-sp585\") pod \"redhat-marketplace-sfjvp\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") " pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.769937 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqcfg"] Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.770273 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lqcfg" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="registry-server" containerID="cri-o://ee12e2a0a0698258bd335ccf558ee1fbfaf42ff34afb315744e41e37424104b1" gracePeriod=2 Dec 15 14:10:02 crc kubenswrapper[4794]: I1215 14:10:02.915983 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfjvp" Dec 15 14:10:03 crc kubenswrapper[4794]: I1215 14:10:03.061666 4794 generic.go:334] "Generic (PLEG): container finished" podID="80a337e8-5e48-4eb7-a448-be437e44390b" containerID="ee12e2a0a0698258bd335ccf558ee1fbfaf42ff34afb315744e41e37424104b1" exitCode=0 Dec 15 14:10:03 crc kubenswrapper[4794]: I1215 14:10:03.061733 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqcfg" event={"ID":"80a337e8-5e48-4eb7-a448-be437e44390b","Type":"ContainerDied","Data":"ee12e2a0a0698258bd335ccf558ee1fbfaf42ff34afb315744e41e37424104b1"} Dec 15 14:10:03 crc kubenswrapper[4794]: I1215 14:10:03.065077 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" event={"ID":"91c7760a-6182-4ad2-b4a6-62a09fa479c6","Type":"ContainerStarted","Data":"edeef5f02135493e9d04e8f8526ea851fb4ed2263d571240ac9d4cf9f947ea08"} Dec 15 14:10:03 crc kubenswrapper[4794]: I1215 14:10:03.098452 4794 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27pps" podStartSLOduration=2.469219403 podStartE2EDuration="7.098432382s" podCreationTimestamp="2025-12-15 14:09:56 +0000 UTC" firstStartedPulling="2025-12-15 14:09:57.471531075 +0000 UTC m=+959.323553513" lastFinishedPulling="2025-12-15 14:10:02.100744024 +0000 UTC m=+963.952766492" observedRunningTime="2025-12-15 14:10:03.086998268 +0000 UTC m=+964.939020726" watchObservedRunningTime="2025-12-15 14:10:03.098432382 +0000 UTC m=+964.950454820" Dec 15 14:10:03 crc kubenswrapper[4794]: I1215 14:10:03.409515 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfjvp"] Dec 15 14:10:03 crc kubenswrapper[4794]: I1215 14:10:03.977898 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.074233 4794 generic.go:334] "Generic (PLEG): container finished" podID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerID="ff2bd304f24019e0a45fbe5c446dcfad3e6f92be3bef4125630dee1b521ba1ec" exitCode=0 Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.074289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfjvp" event={"ID":"1adc964b-5e7f-4685-ae6f-325531ddb5c8","Type":"ContainerDied","Data":"ff2bd304f24019e0a45fbe5c446dcfad3e6f92be3bef4125630dee1b521ba1ec"} Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.074366 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfjvp" event={"ID":"1adc964b-5e7f-4685-ae6f-325531ddb5c8","Type":"ContainerStarted","Data":"408b51e01fdefce270556be91a11a34d7b8ac5073aef2ecedafe32d53034368a"} Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.077458 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lqcfg" event={"ID":"80a337e8-5e48-4eb7-a448-be437e44390b","Type":"ContainerDied","Data":"a69b9b5aa8a31b3aed3b34f3f02106765481e1de60b777f9426c403dd7c99576"} Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.077491 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqcfg" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.077509 4794 scope.go:117] "RemoveContainer" containerID="ee12e2a0a0698258bd335ccf558ee1fbfaf42ff34afb315744e41e37424104b1" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.098035 4794 scope.go:117] "RemoveContainer" containerID="3e762d637079729f1b0ea0502fa67de7862b2860446ee7cad88ea75b40f171d8" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.123109 4794 scope.go:117] "RemoveContainer" containerID="edc00ce7869404b0c83b44410f7859a33d7dd66bdb700fe2366b27a75bda2229" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.154987 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-utilities\") pod \"80a337e8-5e48-4eb7-a448-be437e44390b\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.155182 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-catalog-content\") pod \"80a337e8-5e48-4eb7-a448-be437e44390b\" (UID: \"80a337e8-5e48-4eb7-a448-be437e44390b\") " Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.155303 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr8hc\" (UniqueName: \"kubernetes.io/projected/80a337e8-5e48-4eb7-a448-be437e44390b-kube-api-access-kr8hc\") pod \"80a337e8-5e48-4eb7-a448-be437e44390b\" (UID: 
\"80a337e8-5e48-4eb7-a448-be437e44390b\") " Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.156006 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-utilities" (OuterVolumeSpecName: "utilities") pod "80a337e8-5e48-4eb7-a448-be437e44390b" (UID: "80a337e8-5e48-4eb7-a448-be437e44390b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.160939 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a337e8-5e48-4eb7-a448-be437e44390b-kube-api-access-kr8hc" (OuterVolumeSpecName: "kube-api-access-kr8hc") pod "80a337e8-5e48-4eb7-a448-be437e44390b" (UID: "80a337e8-5e48-4eb7-a448-be437e44390b"). InnerVolumeSpecName "kube-api-access-kr8hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.209183 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80a337e8-5e48-4eb7-a448-be437e44390b" (UID: "80a337e8-5e48-4eb7-a448-be437e44390b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.256307 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.256337 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80a337e8-5e48-4eb7-a448-be437e44390b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.256348 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr8hc\" (UniqueName: \"kubernetes.io/projected/80a337e8-5e48-4eb7-a448-be437e44390b-kube-api-access-kr8hc\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.421467 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqcfg"] Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.430494 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lqcfg"] Dec 15 14:10:04 crc kubenswrapper[4794]: I1215 14:10:04.746787 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" path="/var/lib/kubelet/pods/80a337e8-5e48-4eb7-a448-be437e44390b/volumes" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.659362 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-b67fp"] Dec 15 14:10:05 crc kubenswrapper[4794]: E1215 14:10:05.659957 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="registry-server" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.659974 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" 
containerName="registry-server" Dec 15 14:10:05 crc kubenswrapper[4794]: E1215 14:10:05.660002 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="extract-content" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.660009 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="extract-content" Dec 15 14:10:05 crc kubenswrapper[4794]: E1215 14:10:05.660021 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="extract-utilities" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.660031 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="extract-utilities" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.660166 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a337e8-5e48-4eb7-a448-be437e44390b" containerName="registry-server" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.660705 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.662912 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.663298 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.665633 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9n4tm" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.667221 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-b67fp"] Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.779769 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9w42\" (UniqueName: \"kubernetes.io/projected/c99d997c-3d17-4555-b2c9-65f58c71088b-kube-api-access-n9w42\") pod \"cert-manager-webhook-f4fb5df64-b67fp\" (UID: \"c99d997c-3d17-4555-b2c9-65f58c71088b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.779910 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c99d997c-3d17-4555-b2c9-65f58c71088b-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-b67fp\" (UID: \"c99d997c-3d17-4555-b2c9-65f58c71088b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.880476 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9w42\" (UniqueName: \"kubernetes.io/projected/c99d997c-3d17-4555-b2c9-65f58c71088b-kube-api-access-n9w42\") pod \"cert-manager-webhook-f4fb5df64-b67fp\" (UID: 
\"c99d997c-3d17-4555-b2c9-65f58c71088b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.880553 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c99d997c-3d17-4555-b2c9-65f58c71088b-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-b67fp\" (UID: \"c99d997c-3d17-4555-b2c9-65f58c71088b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.910255 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c99d997c-3d17-4555-b2c9-65f58c71088b-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-b67fp\" (UID: \"c99d997c-3d17-4555-b2c9-65f58c71088b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.916468 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9w42\" (UniqueName: \"kubernetes.io/projected/c99d997c-3d17-4555-b2c9-65f58c71088b-kube-api-access-n9w42\") pod \"cert-manager-webhook-f4fb5df64-b67fp\" (UID: \"c99d997c-3d17-4555-b2c9-65f58c71088b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:05 crc kubenswrapper[4794]: I1215 14:10:05.986239 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" Dec 15 14:10:06 crc kubenswrapper[4794]: I1215 14:10:06.272838 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-b67fp"] Dec 15 14:10:06 crc kubenswrapper[4794]: W1215 14:10:06.277848 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc99d997c_3d17_4555_b2c9_65f58c71088b.slice/crio-a8522713f4f271bf9a323c8d44b1ba67df06deafa11926ec194874df85145a90 WatchSource:0}: Error finding container a8522713f4f271bf9a323c8d44b1ba67df06deafa11926ec194874df85145a90: Status 404 returned error can't find the container with id a8522713f4f271bf9a323c8d44b1ba67df06deafa11926ec194874df85145a90 Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.060957 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"] Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.061966 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb" Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.064238 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jfzjl" Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.099117 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" event={"ID":"c99d997c-3d17-4555-b2c9-65f58c71088b","Type":"ContainerStarted","Data":"a8522713f4f271bf9a323c8d44b1ba67df06deafa11926ec194874df85145a90"} Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.099503 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"] Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.100774 4794 generic.go:334] "Generic (PLEG): container finished" podID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerID="33447506c97a1a1091f50f5b6a1871837e43f63a3eca3b22f34e64dd93459c19" exitCode=0 Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.100808 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfjvp" event={"ID":"1adc964b-5e7f-4685-ae6f-325531ddb5c8","Type":"ContainerDied","Data":"33447506c97a1a1091f50f5b6a1871837e43f63a3eca3b22f34e64dd93459c19"} Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.215244 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52533317-8c98-49ac-b92c-3b0684586408-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-sgfqb\" (UID: \"52533317-8c98-49ac-b92c-3b0684586408\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb" Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.215298 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8b7s\" (UniqueName: 
\"kubernetes.io/projected/52533317-8c98-49ac-b92c-3b0684586408-kube-api-access-r8b7s\") pod \"cert-manager-cainjector-855d9ccff4-sgfqb\" (UID: \"52533317-8c98-49ac-b92c-3b0684586408\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"
Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.316660 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52533317-8c98-49ac-b92c-3b0684586408-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-sgfqb\" (UID: \"52533317-8c98-49ac-b92c-3b0684586408\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"
Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.316711 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8b7s\" (UniqueName: \"kubernetes.io/projected/52533317-8c98-49ac-b92c-3b0684586408-kube-api-access-r8b7s\") pod \"cert-manager-cainjector-855d9ccff4-sgfqb\" (UID: \"52533317-8c98-49ac-b92c-3b0684586408\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"
Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.340319 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52533317-8c98-49ac-b92c-3b0684586408-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-sgfqb\" (UID: \"52533317-8c98-49ac-b92c-3b0684586408\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"
Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.352445 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8b7s\" (UniqueName: \"kubernetes.io/projected/52533317-8c98-49ac-b92c-3b0684586408-kube-api-access-r8b7s\") pod \"cert-manager-cainjector-855d9ccff4-sgfqb\" (UID: \"52533317-8c98-49ac-b92c-3b0684586408\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"
Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.379275 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"
Dec 15 14:10:07 crc kubenswrapper[4794]: I1215 14:10:07.869846 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb"]
Dec 15 14:10:07 crc kubenswrapper[4794]: W1215 14:10:07.882741 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52533317_8c98_49ac_b92c_3b0684586408.slice/crio-9617cb343590d3a49a51592ad2eb9cd4873a94568cf3e3eb4e3f71bfbf8911cb WatchSource:0}: Error finding container 9617cb343590d3a49a51592ad2eb9cd4873a94568cf3e3eb4e3f71bfbf8911cb: Status 404 returned error can't find the container with id 9617cb343590d3a49a51592ad2eb9cd4873a94568cf3e3eb4e3f71bfbf8911cb
Dec 15 14:10:08 crc kubenswrapper[4794]: I1215 14:10:08.114737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb" event={"ID":"52533317-8c98-49ac-b92c-3b0684586408","Type":"ContainerStarted","Data":"9617cb343590d3a49a51592ad2eb9cd4873a94568cf3e3eb4e3f71bfbf8911cb"}
Dec 15 14:10:09 crc kubenswrapper[4794]: I1215 14:10:09.143630 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfjvp" event={"ID":"1adc964b-5e7f-4685-ae6f-325531ddb5c8","Type":"ContainerStarted","Data":"0f767c8ba70786470c2826dd08d8addda93d9c8c3cc6e15f97d03647ffa24aa1"}
Dec 15 14:10:09 crc kubenswrapper[4794]: I1215 14:10:09.187844 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfjvp" podStartSLOduration=3.383283633 podStartE2EDuration="7.187829498s" podCreationTimestamp="2025-12-15 14:10:02 +0000 UTC" firstStartedPulling="2025-12-15 14:10:04.075988959 +0000 UTC m=+965.928011397" lastFinishedPulling="2025-12-15 14:10:07.880534824 +0000 UTC m=+969.732557262" observedRunningTime="2025-12-15 14:10:09.182455915 +0000 UTC m=+971.034478353" watchObservedRunningTime="2025-12-15 14:10:09.187829498 +0000 UTC m=+971.039851936"
Dec 15 14:10:12 crc kubenswrapper[4794]: I1215 14:10:12.917084 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfjvp"
Dec 15 14:10:12 crc kubenswrapper[4794]: I1215 14:10:12.917495 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfjvp"
Dec 15 14:10:12 crc kubenswrapper[4794]: I1215 14:10:12.993344 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfjvp"
Dec 15 14:10:13 crc kubenswrapper[4794]: I1215 14:10:13.291178 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfjvp"
Dec 15 14:10:13 crc kubenswrapper[4794]: I1215 14:10:13.348420 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfjvp"]
Dec 15 14:10:15 crc kubenswrapper[4794]: I1215 14:10:15.216022 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfjvp" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="registry-server" containerID="cri-o://0f767c8ba70786470c2826dd08d8addda93d9c8c3cc6e15f97d03647ffa24aa1" gracePeriod=2
Dec 15 14:10:16 crc kubenswrapper[4794]: I1215 14:10:16.229129 4794 generic.go:334] "Generic (PLEG): container finished" podID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerID="0f767c8ba70786470c2826dd08d8addda93d9c8c3cc6e15f97d03647ffa24aa1" exitCode=0
Dec 15 14:10:16 crc kubenswrapper[4794]: I1215 14:10:16.229207 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfjvp" event={"ID":"1adc964b-5e7f-4685-ae6f-325531ddb5c8","Type":"ContainerDied","Data":"0f767c8ba70786470c2826dd08d8addda93d9c8c3cc6e15f97d03647ffa24aa1"}
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.161196 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfjvp"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.241762 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" event={"ID":"c99d997c-3d17-4555-b2c9-65f58c71088b","Type":"ContainerStarted","Data":"f8ac2b0f96a016607610a3bbc570fbcf1696f1cf681935ea23d8bde219438ca2"}
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.242707 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.244438 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb" event={"ID":"52533317-8c98-49ac-b92c-3b0684586408","Type":"ContainerStarted","Data":"88b5497d386c95746d34f7b25cf35a0484b573232fd8c0c0d6ae787c93cd31c9"}
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.246565 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfjvp" event={"ID":"1adc964b-5e7f-4685-ae6f-325531ddb5c8","Type":"ContainerDied","Data":"408b51e01fdefce270556be91a11a34d7b8ac5073aef2ecedafe32d53034368a"}
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.246609 4794 scope.go:117] "RemoveContainer" containerID="0f767c8ba70786470c2826dd08d8addda93d9c8c3cc6e15f97d03647ffa24aa1"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.246618 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfjvp"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.260766 4794 scope.go:117] "RemoveContainer" containerID="33447506c97a1a1091f50f5b6a1871837e43f63a3eca3b22f34e64dd93459c19"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.269428 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp" podStartSLOduration=1.569369933 podStartE2EDuration="13.269380439s" podCreationTimestamp="2025-12-15 14:10:05 +0000 UTC" firstStartedPulling="2025-12-15 14:10:06.279737302 +0000 UTC m=+968.131759740" lastFinishedPulling="2025-12-15 14:10:17.979747818 +0000 UTC m=+979.831770246" observedRunningTime="2025-12-15 14:10:18.258973065 +0000 UTC m=+980.110995523" watchObservedRunningTime="2025-12-15 14:10:18.269380439 +0000 UTC m=+980.121402887"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.292738 4794 scope.go:117] "RemoveContainer" containerID="ff2bd304f24019e0a45fbe5c446dcfad3e6f92be3bef4125630dee1b521ba1ec"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.312500 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-sgfqb" podStartSLOduration=1.186169197 podStartE2EDuration="11.312476456s" podCreationTimestamp="2025-12-15 14:10:07 +0000 UTC" firstStartedPulling="2025-12-15 14:10:07.883875699 +0000 UTC m=+969.735898137" lastFinishedPulling="2025-12-15 14:10:18.010182948 +0000 UTC m=+979.862205396" observedRunningTime="2025-12-15 14:10:18.280256186 +0000 UTC m=+980.132278644" watchObservedRunningTime="2025-12-15 14:10:18.312476456 +0000 UTC m=+980.164498894"
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.314000 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-catalog-content\") pod \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") "
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.314052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp585\" (UniqueName: \"kubernetes.io/projected/1adc964b-5e7f-4685-ae6f-325531ddb5c8-kube-api-access-sp585\") pod \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") "
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.314079 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-utilities\") pod \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\" (UID: \"1adc964b-5e7f-4685-ae6f-325531ddb5c8\") "
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.316168 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-utilities" (OuterVolumeSpecName: "utilities") pod "1adc964b-5e7f-4685-ae6f-325531ddb5c8" (UID: "1adc964b-5e7f-4685-ae6f-325531ddb5c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.325160 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adc964b-5e7f-4685-ae6f-325531ddb5c8-kube-api-access-sp585" (OuterVolumeSpecName: "kube-api-access-sp585") pod "1adc964b-5e7f-4685-ae6f-325531ddb5c8" (UID: "1adc964b-5e7f-4685-ae6f-325531ddb5c8"). InnerVolumeSpecName "kube-api-access-sp585". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.343218 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1adc964b-5e7f-4685-ae6f-325531ddb5c8" (UID: "1adc964b-5e7f-4685-ae6f-325531ddb5c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.415202 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.415243 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp585\" (UniqueName: \"kubernetes.io/projected/1adc964b-5e7f-4685-ae6f-325531ddb5c8-kube-api-access-sp585\") on node \"crc\" DevicePath \"\""
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.415260 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adc964b-5e7f-4685-ae6f-325531ddb5c8-utilities\") on node \"crc\" DevicePath \"\""
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.577439 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfjvp"]
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.583523 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfjvp"]
Dec 15 14:10:18 crc kubenswrapper[4794]: I1215 14:10:18.745090 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" path="/var/lib/kubelet/pods/1adc964b-5e7f-4685-ae6f-325531ddb5c8/volumes"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.408683 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v6c4k"]
Dec 15 14:10:20 crc kubenswrapper[4794]: E1215 14:10:20.409457 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="extract-content"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.409479 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="extract-content"
Dec 15 14:10:20 crc kubenswrapper[4794]: E1215 14:10:20.409502 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="registry-server"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.409513 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="registry-server"
Dec 15 14:10:20 crc kubenswrapper[4794]: E1215 14:10:20.409537 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="extract-utilities"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.409613 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="extract-utilities"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.409807 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adc964b-5e7f-4685-ae6f-325531ddb5c8" containerName="registry-server"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.411157 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.439233 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6c4k"]
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.441730 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-catalog-content\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.441871 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrzn\" (UniqueName: \"kubernetes.io/projected/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-kube-api-access-sqrzn\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.441927 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-utilities\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.543482 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-catalog-content\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.543570 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrzn\" (UniqueName: \"kubernetes.io/projected/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-kube-api-access-sqrzn\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.543617 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-utilities\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.544002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-catalog-content\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.544136 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-utilities\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.576695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrzn\" (UniqueName: \"kubernetes.io/projected/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-kube-api-access-sqrzn\") pod \"certified-operators-v6c4k\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:20 crc kubenswrapper[4794]: I1215 14:10:20.734810 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:21 crc kubenswrapper[4794]: I1215 14:10:21.027892 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6c4k"]
Dec 15 14:10:21 crc kubenswrapper[4794]: W1215 14:10:21.033138 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d9bf45e_03cd_463f_8c8e_53cfa5e169f0.slice/crio-ff65b20093421d10b09b1d8e29a5a3f717f08ba4e238c862efc2c952076e731e WatchSource:0}: Error finding container ff65b20093421d10b09b1d8e29a5a3f717f08ba4e238c862efc2c952076e731e: Status 404 returned error can't find the container with id ff65b20093421d10b09b1d8e29a5a3f717f08ba4e238c862efc2c952076e731e
Dec 15 14:10:21 crc kubenswrapper[4794]: I1215 14:10:21.269468 4794 generic.go:334] "Generic (PLEG): container finished" podID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerID="a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947" exitCode=0
Dec 15 14:10:21 crc kubenswrapper[4794]: I1215 14:10:21.269524 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6c4k" event={"ID":"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0","Type":"ContainerDied","Data":"a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947"}
Dec 15 14:10:21 crc kubenswrapper[4794]: I1215 14:10:21.269561 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6c4k" event={"ID":"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0","Type":"ContainerStarted","Data":"ff65b20093421d10b09b1d8e29a5a3f717f08ba4e238c862efc2c952076e731e"}
Dec 15 14:10:23 crc kubenswrapper[4794]: I1215 14:10:23.316220 4794 generic.go:334] "Generic (PLEG): container finished" podID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerID="729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474" exitCode=0
Dec 15 14:10:23 crc kubenswrapper[4794]: I1215 14:10:23.316839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6c4k" event={"ID":"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0","Type":"ContainerDied","Data":"729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474"}
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.327323 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6c4k" event={"ID":"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0","Type":"ContainerStarted","Data":"c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201"}
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.353999 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v6c4k" podStartSLOduration=1.494625979 podStartE2EDuration="4.353973663s" podCreationTimestamp="2025-12-15 14:10:20 +0000 UTC" firstStartedPulling="2025-12-15 14:10:21.270770616 +0000 UTC m=+983.122793054" lastFinishedPulling="2025-12-15 14:10:24.1301183 +0000 UTC m=+985.982140738" observedRunningTime="2025-12-15 14:10:24.350170656 +0000 UTC m=+986.202193094" watchObservedRunningTime="2025-12-15 14:10:24.353973663 +0000 UTC m=+986.205996111"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.528039 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-d9dgk"]
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.529111 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.532189 4794 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qb847"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.533634 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.533694 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.533742 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.534392 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"412df8a876437db1b8dca58aacf9cc551c4e0ee66a2d4593a2815bb14d689e05"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.534471 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://412df8a876437db1b8dca58aacf9cc551c4e0ee66a2d4593a2815bb14d689e05" gracePeriod=600
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.545996 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-d9dgk"]
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.599504 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fm95\" (UniqueName: \"kubernetes.io/projected/e6eea6f6-edca-4fad-9a5a-b5af09663e17-kube-api-access-4fm95\") pod \"cert-manager-86cb77c54b-d9dgk\" (UID: \"e6eea6f6-edca-4fad-9a5a-b5af09663e17\") " pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.599648 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6eea6f6-edca-4fad-9a5a-b5af09663e17-bound-sa-token\") pod \"cert-manager-86cb77c54b-d9dgk\" (UID: \"e6eea6f6-edca-4fad-9a5a-b5af09663e17\") " pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.701216 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fm95\" (UniqueName: \"kubernetes.io/projected/e6eea6f6-edca-4fad-9a5a-b5af09663e17-kube-api-access-4fm95\") pod \"cert-manager-86cb77c54b-d9dgk\" (UID: \"e6eea6f6-edca-4fad-9a5a-b5af09663e17\") " pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.701671 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6eea6f6-edca-4fad-9a5a-b5af09663e17-bound-sa-token\") pod \"cert-manager-86cb77c54b-d9dgk\" (UID: \"e6eea6f6-edca-4fad-9a5a-b5af09663e17\") " pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.729870 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6eea6f6-edca-4fad-9a5a-b5af09663e17-bound-sa-token\") pod \"cert-manager-86cb77c54b-d9dgk\" (UID: \"e6eea6f6-edca-4fad-9a5a-b5af09663e17\") " pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.732489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fm95\" (UniqueName: \"kubernetes.io/projected/e6eea6f6-edca-4fad-9a5a-b5af09663e17-kube-api-access-4fm95\") pod \"cert-manager-86cb77c54b-d9dgk\" (UID: \"e6eea6f6-edca-4fad-9a5a-b5af09663e17\") " pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:24 crc kubenswrapper[4794]: I1215 14:10:24.862020 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-d9dgk"
Dec 15 14:10:25 crc kubenswrapper[4794]: I1215 14:10:25.336555 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-d9dgk"]
Dec 15 14:10:25 crc kubenswrapper[4794]: W1215 14:10:25.339420 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6eea6f6_edca_4fad_9a5a_b5af09663e17.slice/crio-411cf8b54b786a8a16cd5b159d4a0ddc204b2cc0996649a4aeadd45686a1fb5d WatchSource:0}: Error finding container 411cf8b54b786a8a16cd5b159d4a0ddc204b2cc0996649a4aeadd45686a1fb5d: Status 404 returned error can't find the container with id 411cf8b54b786a8a16cd5b159d4a0ddc204b2cc0996649a4aeadd45686a1fb5d
Dec 15 14:10:25 crc kubenswrapper[4794]: I1215 14:10:25.355042 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="412df8a876437db1b8dca58aacf9cc551c4e0ee66a2d4593a2815bb14d689e05" exitCode=0
Dec 15 14:10:25 crc kubenswrapper[4794]: I1215 14:10:25.355125 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"412df8a876437db1b8dca58aacf9cc551c4e0ee66a2d4593a2815bb14d689e05"}
Dec 15 14:10:25 crc kubenswrapper[4794]: I1215 14:10:25.355190 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"69cd9719316ef2e80637a18253154a420b07d0d2cd576a13b5c600dba7da07e9"}
Dec 15 14:10:25 crc kubenswrapper[4794]: I1215 14:10:25.355223 4794 scope.go:117] "RemoveContainer" containerID="abdd64a3133374cb12ad4cc9a75da62f63e41a60a16866f7f2bcba98242002ca"
Dec 15 14:10:25 crc kubenswrapper[4794]: I1215 14:10:25.990683 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-b67fp"
Dec 15 14:10:26 crc kubenswrapper[4794]: I1215 14:10:26.363958 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-d9dgk" event={"ID":"e6eea6f6-edca-4fad-9a5a-b5af09663e17","Type":"ContainerStarted","Data":"9f5abfa7c7521481ac025e73a97ed6397393e8cd6dc476e1dd7ad8f159f161e5"}
Dec 15 14:10:26 crc kubenswrapper[4794]: I1215 14:10:26.363994 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-d9dgk" event={"ID":"e6eea6f6-edca-4fad-9a5a-b5af09663e17","Type":"ContainerStarted","Data":"411cf8b54b786a8a16cd5b159d4a0ddc204b2cc0996649a4aeadd45686a1fb5d"}
Dec 15 14:10:26 crc kubenswrapper[4794]: I1215 14:10:26.386376 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-d9dgk" podStartSLOduration=2.386350699 podStartE2EDuration="2.386350699s" podCreationTimestamp="2025-12-15 14:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:10:26.380163284 +0000 UTC m=+988.232185732" watchObservedRunningTime="2025-12-15 14:10:26.386350699 +0000 UTC m=+988.238373157"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.315823 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nwvw4"]
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.317234 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwvw4"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.324107 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.324330 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.325101 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-mmntc"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.332757 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nwvw4"]
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.359285 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghb6f\" (UniqueName: \"kubernetes.io/projected/8eebdde5-3191-4fcc-b715-ea239b97d7bc-kube-api-access-ghb6f\") pod \"openstack-operator-index-nwvw4\" (UID: \"8eebdde5-3191-4fcc-b715-ea239b97d7bc\") " pod="openstack-operators/openstack-operator-index-nwvw4"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.461177 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghb6f\" (UniqueName: \"kubernetes.io/projected/8eebdde5-3191-4fcc-b715-ea239b97d7bc-kube-api-access-ghb6f\") pod \"openstack-operator-index-nwvw4\" (UID: \"8eebdde5-3191-4fcc-b715-ea239b97d7bc\") " pod="openstack-operators/openstack-operator-index-nwvw4"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.483790 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghb6f\" (UniqueName: \"kubernetes.io/projected/8eebdde5-3191-4fcc-b715-ea239b97d7bc-kube-api-access-ghb6f\") pod \"openstack-operator-index-nwvw4\" (UID: \"8eebdde5-3191-4fcc-b715-ea239b97d7bc\") " pod="openstack-operators/openstack-operator-index-nwvw4"
Dec 15 14:10:29 crc kubenswrapper[4794]: I1215 14:10:29.637071 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwvw4"
Dec 15 14:10:30 crc kubenswrapper[4794]: I1215 14:10:30.159198 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nwvw4"]
Dec 15 14:10:30 crc kubenswrapper[4794]: W1215 14:10:30.168199 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eebdde5_3191_4fcc_b715_ea239b97d7bc.slice/crio-dc79226b6874368b40f167b6b392a7e774383c7ce63e6059312dd242d3ff9d51 WatchSource:0}: Error finding container dc79226b6874368b40f167b6b392a7e774383c7ce63e6059312dd242d3ff9d51: Status 404 returned error can't find the container with id dc79226b6874368b40f167b6b392a7e774383c7ce63e6059312dd242d3ff9d51
Dec 15 14:10:30 crc kubenswrapper[4794]: I1215 14:10:30.392322 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwvw4" event={"ID":"8eebdde5-3191-4fcc-b715-ea239b97d7bc","Type":"ContainerStarted","Data":"dc79226b6874368b40f167b6b392a7e774383c7ce63e6059312dd242d3ff9d51"}
Dec 15 14:10:30 crc kubenswrapper[4794]: I1215 14:10:30.739795 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:30 crc kubenswrapper[4794]: I1215 14:10:30.747877 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:30 crc kubenswrapper[4794]: I1215 14:10:30.805143 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:31 crc kubenswrapper[4794]: I1215 14:10:31.446437 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v6c4k"
Dec 15 14:10:33 crc kubenswrapper[4794]: I1215 14:10:33.298538 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nwvw4"]
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.103080 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nqbjm"]
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.104278 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nqbjm"
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.125284 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nqbjm"]
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.228655 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-557zr\" (UniqueName: \"kubernetes.io/projected/da21a213-5a5f-4ce3-a3a4-c0579e46a726-kube-api-access-557zr\") pod \"openstack-operator-index-nqbjm\" (UID: \"da21a213-5a5f-4ce3-a3a4-c0579e46a726\") " pod="openstack-operators/openstack-operator-index-nqbjm"
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.329788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-557zr\" (UniqueName: \"kubernetes.io/projected/da21a213-5a5f-4ce3-a3a4-c0579e46a726-kube-api-access-557zr\") pod \"openstack-operator-index-nqbjm\" (UID: \"da21a213-5a5f-4ce3-a3a4-c0579e46a726\") " pod="openstack-operators/openstack-operator-index-nqbjm"
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.366658 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-557zr\" (UniqueName: \"kubernetes.io/projected/da21a213-5a5f-4ce3-a3a4-c0579e46a726-kube-api-access-557zr\") pod \"openstack-operator-index-nqbjm\" (UID: \"da21a213-5a5f-4ce3-a3a4-c0579e46a726\") " pod="openstack-operators/openstack-operator-index-nqbjm"
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.422605 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwvw4" event={"ID":"8eebdde5-3191-4fcc-b715-ea239b97d7bc","Type":"ContainerStarted","Data":"f7bf76cabd5ba2ebc736da3a568c4c0e0522b3223593274eb6104b0f93afbfa3"}
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.423022 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nwvw4" podUID="8eebdde5-3191-4fcc-b715-ea239b97d7bc" containerName="registry-server" containerID="cri-o://f7bf76cabd5ba2ebc736da3a568c4c0e0522b3223593274eb6104b0f93afbfa3" gracePeriod=2
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.431728 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nqbjm"
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.440744 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nwvw4" podStartSLOduration=1.9672326500000001 podStartE2EDuration="5.440728441s" podCreationTimestamp="2025-12-15 14:10:29 +0000 UTC" firstStartedPulling="2025-12-15 14:10:30.171062201 +0000 UTC m=+992.023084639" lastFinishedPulling="2025-12-15 14:10:33.644557992 +0000 UTC m=+995.496580430" observedRunningTime="2025-12-15 14:10:34.436423719 +0000 UTC m=+996.288446167" watchObservedRunningTime="2025-12-15 14:10:34.440728441 +0000 UTC m=+996.292750879"
Dec 15 14:10:34 crc kubenswrapper[4794]: I1215 14:10:34.895558 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nqbjm"]
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.449790 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nqbjm" event={"ID":"da21a213-5a5f-4ce3-a3a4-c0579e46a726","Type":"ContainerStarted","Data":"0050c2e44cded1b16be2ab7fe7048ff94071ec4f62fd3e707fc389f2cb8aa1bf"}
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.451394 4794 generic.go:334] "Generic (PLEG): container finished" podID="8eebdde5-3191-4fcc-b715-ea239b97d7bc" containerID="f7bf76cabd5ba2ebc736da3a568c4c0e0522b3223593274eb6104b0f93afbfa3" exitCode=0
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.451420 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwvw4" event={"ID":"8eebdde5-3191-4fcc-b715-ea239b97d7bc","Type":"ContainerDied","Data":"f7bf76cabd5ba2ebc736da3a568c4c0e0522b3223593274eb6104b0f93afbfa3"}
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.488490 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6c4k"]
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.488892 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v6c4k" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="registry-server" containerID="cri-o://c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201" gracePeriod=2
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.754046 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwvw4"
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.850408 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghb6f\" (UniqueName: \"kubernetes.io/projected/8eebdde5-3191-4fcc-b715-ea239b97d7bc-kube-api-access-ghb6f\") pod \"8eebdde5-3191-4fcc-b715-ea239b97d7bc\" (UID: \"8eebdde5-3191-4fcc-b715-ea239b97d7bc\") "
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.856342 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eebdde5-3191-4fcc-b715-ea239b97d7bc-kube-api-access-ghb6f" (OuterVolumeSpecName: "kube-api-access-ghb6f") pod "8eebdde5-3191-4fcc-b715-ea239b97d7bc" (UID: "8eebdde5-3191-4fcc-b715-ea239b97d7bc"). InnerVolumeSpecName "kube-api-access-ghb6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.920336 4794 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-v6c4k" Dec 15 14:10:35 crc kubenswrapper[4794]: I1215 14:10:35.951864 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghb6f\" (UniqueName: \"kubernetes.io/projected/8eebdde5-3191-4fcc-b715-ea239b97d7bc-kube-api-access-ghb6f\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.052807 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-utilities\") pod \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.053249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-catalog-content\") pod \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.054523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-utilities" (OuterVolumeSpecName: "utilities") pod "5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" (UID: "5d9bf45e-03cd-463f-8c8e-53cfa5e169f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.063293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqrzn\" (UniqueName: \"kubernetes.io/projected/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-kube-api-access-sqrzn\") pod \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\" (UID: \"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0\") " Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.063718 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.067146 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-kube-api-access-sqrzn" (OuterVolumeSpecName: "kube-api-access-sqrzn") pod "5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" (UID: "5d9bf45e-03cd-463f-8c8e-53cfa5e169f0"). InnerVolumeSpecName "kube-api-access-sqrzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.103291 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" (UID: "5d9bf45e-03cd-463f-8c8e-53cfa5e169f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.165380 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqrzn\" (UniqueName: \"kubernetes.io/projected/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-kube-api-access-sqrzn\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.165666 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.459492 4794 generic.go:334] "Generic (PLEG): container finished" podID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerID="c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201" exitCode=0 Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.459557 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6c4k" event={"ID":"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0","Type":"ContainerDied","Data":"c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201"} Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.459613 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6c4k" event={"ID":"5d9bf45e-03cd-463f-8c8e-53cfa5e169f0","Type":"ContainerDied","Data":"ff65b20093421d10b09b1d8e29a5a3f717f08ba4e238c862efc2c952076e731e"} Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.459634 4794 scope.go:117] "RemoveContainer" containerID="c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.459976 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v6c4k" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.461342 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwvw4" event={"ID":"8eebdde5-3191-4fcc-b715-ea239b97d7bc","Type":"ContainerDied","Data":"dc79226b6874368b40f167b6b392a7e774383c7ce63e6059312dd242d3ff9d51"} Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.461352 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwvw4" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.467918 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nqbjm" event={"ID":"da21a213-5a5f-4ce3-a3a4-c0579e46a726","Type":"ContainerStarted","Data":"f825dfd68cd4333405e75558ffe440e4769be4e2e943f804102d3ff8c2bb5f56"} Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.486069 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nqbjm" podStartSLOduration=1.883686827 podStartE2EDuration="2.486052402s" podCreationTimestamp="2025-12-15 14:10:34 +0000 UTC" firstStartedPulling="2025-12-15 14:10:34.90981594 +0000 UTC m=+996.761838418" lastFinishedPulling="2025-12-15 14:10:35.512181555 +0000 UTC m=+997.364203993" observedRunningTime="2025-12-15 14:10:36.481824192 +0000 UTC m=+998.333846680" watchObservedRunningTime="2025-12-15 14:10:36.486052402 +0000 UTC m=+998.338074840" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.498317 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6c4k"] Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.506627 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v6c4k"] Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.512570 4794 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nwvw4"] Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.517549 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nwvw4"] Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.641392 4794 scope.go:117] "RemoveContainer" containerID="729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.665700 4794 scope.go:117] "RemoveContainer" containerID="a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.687907 4794 scope.go:117] "RemoveContainer" containerID="c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201" Dec 15 14:10:36 crc kubenswrapper[4794]: E1215 14:10:36.689021 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201\": container with ID starting with c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201 not found: ID does not exist" containerID="c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.689123 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201"} err="failed to get container status \"c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201\": rpc error: code = NotFound desc = could not find container \"c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201\": container with ID starting with c4e1729fca078dbf0e39278acf8d8f18e1dd4cebe65b35c9e03b4c3c0375e201 not found: ID does not exist" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.689280 4794 scope.go:117] "RemoveContainer" 
containerID="729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474" Dec 15 14:10:36 crc kubenswrapper[4794]: E1215 14:10:36.689887 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474\": container with ID starting with 729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474 not found: ID does not exist" containerID="729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.689934 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474"} err="failed to get container status \"729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474\": rpc error: code = NotFound desc = could not find container \"729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474\": container with ID starting with 729e9138d3efd96ce8ec8d0405d3e74656c92376451f74a4da7a115208fde474 not found: ID does not exist" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.689955 4794 scope.go:117] "RemoveContainer" containerID="a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947" Dec 15 14:10:36 crc kubenswrapper[4794]: E1215 14:10:36.690346 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947\": container with ID starting with a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947 not found: ID does not exist" containerID="a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.690374 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947"} err="failed to get container status \"a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947\": rpc error: code = NotFound desc = could not find container \"a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947\": container with ID starting with a472e35fdfdb0436031b2ceca3f87d1052a8b19acdc8b8bd9b7849d37ba5d947 not found: ID does not exist" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.690429 4794 scope.go:117] "RemoveContainer" containerID="f7bf76cabd5ba2ebc736da3a568c4c0e0522b3223593274eb6104b0f93afbfa3" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.743231 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" path="/var/lib/kubelet/pods/5d9bf45e-03cd-463f-8c8e-53cfa5e169f0/volumes" Dec 15 14:10:36 crc kubenswrapper[4794]: I1215 14:10:36.743789 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eebdde5-3191-4fcc-b715-ea239b97d7bc" path="/var/lib/kubelet/pods/8eebdde5-3191-4fcc-b715-ea239b97d7bc/volumes" Dec 15 14:10:44 crc kubenswrapper[4794]: I1215 14:10:44.432566 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nqbjm" Dec 15 14:10:44 crc kubenswrapper[4794]: I1215 14:10:44.433292 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nqbjm" Dec 15 14:10:44 crc kubenswrapper[4794]: I1215 14:10:44.478378 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nqbjm" Dec 15 14:10:44 crc kubenswrapper[4794]: I1215 14:10:44.584828 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nqbjm" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.963800 4794 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz"] Dec 15 14:10:46 crc kubenswrapper[4794]: E1215 14:10:46.964485 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="extract-content" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.964505 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="extract-content" Dec 15 14:10:46 crc kubenswrapper[4794]: E1215 14:10:46.964538 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eebdde5-3191-4fcc-b715-ea239b97d7bc" containerName="registry-server" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.964551 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eebdde5-3191-4fcc-b715-ea239b97d7bc" containerName="registry-server" Dec 15 14:10:46 crc kubenswrapper[4794]: E1215 14:10:46.964575 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="registry-server" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.964617 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="registry-server" Dec 15 14:10:46 crc kubenswrapper[4794]: E1215 14:10:46.964648 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="extract-utilities" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.964660 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="extract-utilities" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.964855 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9bf45e-03cd-463f-8c8e-53cfa5e169f0" containerName="registry-server" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.964879 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8eebdde5-3191-4fcc-b715-ea239b97d7bc" containerName="registry-server" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.966951 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:46 crc kubenswrapper[4794]: I1215 14:10:46.972175 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9nhp5" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:46.985575 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz"] Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.016447 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rkzv\" (UniqueName: \"kubernetes.io/projected/91c52bd8-479c-4fee-bd2c-f46432f75395-kube-api-access-7rkzv\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.018094 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-bundle\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.018209 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-util\") pod 
\"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.119864 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-bundle\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.119988 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-util\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.120070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rkzv\" (UniqueName: \"kubernetes.io/projected/91c52bd8-479c-4fee-bd2c-f46432f75395-kube-api-access-7rkzv\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.121185 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-bundle\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " 
pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.121290 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-util\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.143633 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rkzv\" (UniqueName: \"kubernetes.io/projected/91c52bd8-479c-4fee-bd2c-f46432f75395-kube-api-access-7rkzv\") pod \"7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.331167 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:47 crc kubenswrapper[4794]: I1215 14:10:47.830633 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz"] Dec 15 14:10:48 crc kubenswrapper[4794]: I1215 14:10:48.576215 4794 generic.go:334] "Generic (PLEG): container finished" podID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerID="c7bed2d3688d85be304ba64e04ee3e0afe5a0d38de0295fa4fe45bdd824cdb30" exitCode=0 Dec 15 14:10:48 crc kubenswrapper[4794]: I1215 14:10:48.576319 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" event={"ID":"91c52bd8-479c-4fee-bd2c-f46432f75395","Type":"ContainerDied","Data":"c7bed2d3688d85be304ba64e04ee3e0afe5a0d38de0295fa4fe45bdd824cdb30"} Dec 15 14:10:48 crc kubenswrapper[4794]: I1215 14:10:48.576672 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" event={"ID":"91c52bd8-479c-4fee-bd2c-f46432f75395","Type":"ContainerStarted","Data":"bf2e644e96763c71ec35f6d7affa6e65644ddc8573c2624c76377baec5a168fa"} Dec 15 14:10:49 crc kubenswrapper[4794]: I1215 14:10:49.588572 4794 generic.go:334] "Generic (PLEG): container finished" podID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerID="f6cbb359d7d0213271e517b39496d48df6db9d683deaaded626cc7947be83dd0" exitCode=0 Dec 15 14:10:49 crc kubenswrapper[4794]: I1215 14:10:49.588755 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" event={"ID":"91c52bd8-479c-4fee-bd2c-f46432f75395","Type":"ContainerDied","Data":"f6cbb359d7d0213271e517b39496d48df6db9d683deaaded626cc7947be83dd0"} Dec 15 14:10:50 crc kubenswrapper[4794]: I1215 14:10:50.600813 4794 generic.go:334] 
"Generic (PLEG): container finished" podID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerID="5dbd5e45200c00d6f08bfd75f338f04bab8aefd252657fde7d5953e523cfb27a" exitCode=0 Dec 15 14:10:50 crc kubenswrapper[4794]: I1215 14:10:50.600913 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" event={"ID":"91c52bd8-479c-4fee-bd2c-f46432f75395","Type":"ContainerDied","Data":"5dbd5e45200c00d6f08bfd75f338f04bab8aefd252657fde7d5953e523cfb27a"} Dec 15 14:10:51 crc kubenswrapper[4794]: I1215 14:10:51.949357 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:51 crc kubenswrapper[4794]: I1215 14:10:51.998709 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rkzv\" (UniqueName: \"kubernetes.io/projected/91c52bd8-479c-4fee-bd2c-f46432f75395-kube-api-access-7rkzv\") pod \"91c52bd8-479c-4fee-bd2c-f46432f75395\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " Dec 15 14:10:51 crc kubenswrapper[4794]: I1215 14:10:51.998781 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-bundle\") pod \"91c52bd8-479c-4fee-bd2c-f46432f75395\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " Dec 15 14:10:51 crc kubenswrapper[4794]: I1215 14:10:51.998813 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-util\") pod \"91c52bd8-479c-4fee-bd2c-f46432f75395\" (UID: \"91c52bd8-479c-4fee-bd2c-f46432f75395\") " Dec 15 14:10:51 crc kubenswrapper[4794]: I1215 14:10:51.999895 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-bundle" (OuterVolumeSpecName: "bundle") pod "91c52bd8-479c-4fee-bd2c-f46432f75395" (UID: "91c52bd8-479c-4fee-bd2c-f46432f75395"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.007427 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c52bd8-479c-4fee-bd2c-f46432f75395-kube-api-access-7rkzv" (OuterVolumeSpecName: "kube-api-access-7rkzv") pod "91c52bd8-479c-4fee-bd2c-f46432f75395" (UID: "91c52bd8-479c-4fee-bd2c-f46432f75395"). InnerVolumeSpecName "kube-api-access-7rkzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.012456 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-util" (OuterVolumeSpecName: "util") pod "91c52bd8-479c-4fee-bd2c-f46432f75395" (UID: "91c52bd8-479c-4fee-bd2c-f46432f75395"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.100213 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rkzv\" (UniqueName: \"kubernetes.io/projected/91c52bd8-479c-4fee-bd2c-f46432f75395-kube-api-access-7rkzv\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.100526 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.100735 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c52bd8-479c-4fee-bd2c-f46432f75395-util\") on node \"crc\" DevicePath \"\"" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.621343 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" event={"ID":"91c52bd8-479c-4fee-bd2c-f46432f75395","Type":"ContainerDied","Data":"bf2e644e96763c71ec35f6d7affa6e65644ddc8573c2624c76377baec5a168fa"} Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.621403 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf2e644e96763c71ec35f6d7affa6e65644ddc8573c2624c76377baec5a168fa" Dec 15 14:10:52 crc kubenswrapper[4794]: I1215 14:10:52.621468 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.848317 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5"] Dec 15 14:10:56 crc kubenswrapper[4794]: E1215 14:10:56.849334 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="extract" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.849348 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="extract" Dec 15 14:10:56 crc kubenswrapper[4794]: E1215 14:10:56.849366 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="util" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.849373 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="util" Dec 15 14:10:56 crc kubenswrapper[4794]: E1215 14:10:56.849387 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="pull" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.849397 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="pull" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.849540 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c52bd8-479c-4fee-bd2c-f46432f75395" containerName="extract" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.850354 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.852441 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-sc5kw" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.875264 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8xmk\" (UniqueName: \"kubernetes.io/projected/46324c1c-ee38-4acf-b149-6a6a970e0df4-kube-api-access-b8xmk\") pod \"openstack-operator-controller-operator-59d975664b-tfxz5\" (UID: \"46324c1c-ee38-4acf-b149-6a6a970e0df4\") " pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.935621 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5"] Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.976301 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8xmk\" (UniqueName: \"kubernetes.io/projected/46324c1c-ee38-4acf-b149-6a6a970e0df4-kube-api-access-b8xmk\") pod \"openstack-operator-controller-operator-59d975664b-tfxz5\" (UID: \"46324c1c-ee38-4acf-b149-6a6a970e0df4\") " pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:10:56 crc kubenswrapper[4794]: I1215 14:10:56.997409 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8xmk\" (UniqueName: \"kubernetes.io/projected/46324c1c-ee38-4acf-b149-6a6a970e0df4-kube-api-access-b8xmk\") pod \"openstack-operator-controller-operator-59d975664b-tfxz5\" (UID: \"46324c1c-ee38-4acf-b149-6a6a970e0df4\") " pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:10:57 crc kubenswrapper[4794]: I1215 14:10:57.170931 4794 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:10:57 crc kubenswrapper[4794]: I1215 14:10:57.633193 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5"] Dec 15 14:10:57 crc kubenswrapper[4794]: I1215 14:10:57.660987 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" event={"ID":"46324c1c-ee38-4acf-b149-6a6a970e0df4","Type":"ContainerStarted","Data":"229b451dfdd8da2122e1ec0410df7543b294ef65eec34d5a530fc3c42a9af507"} Dec 15 14:11:01 crc kubenswrapper[4794]: I1215 14:11:01.690356 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" event={"ID":"46324c1c-ee38-4acf-b149-6a6a970e0df4","Type":"ContainerStarted","Data":"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079"} Dec 15 14:11:03 crc kubenswrapper[4794]: I1215 14:11:03.705817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" event={"ID":"46324c1c-ee38-4acf-b149-6a6a970e0df4","Type":"ContainerStarted","Data":"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c"} Dec 15 14:11:03 crc kubenswrapper[4794]: I1215 14:11:03.706360 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:11:03 crc kubenswrapper[4794]: I1215 14:11:03.756380 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" podStartSLOduration=1.893650614 podStartE2EDuration="7.756355361s" podCreationTimestamp="2025-12-15 14:10:56 +0000 UTC" firstStartedPulling="2025-12-15 
14:10:57.631395307 +0000 UTC m=+1019.483417755" lastFinishedPulling="2025-12-15 14:11:03.494100024 +0000 UTC m=+1025.346122502" observedRunningTime="2025-12-15 14:11:03.746154883 +0000 UTC m=+1025.598177351" watchObservedRunningTime="2025-12-15 14:11:03.756355361 +0000 UTC m=+1025.608377829" Dec 15 14:11:07 crc kubenswrapper[4794]: I1215 14:11:07.174866 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.313058 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.315011 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.316822 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6xsff" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.317144 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.318783 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.321188 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.322143 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.324155 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-66mc9" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.326076 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-c7zhr" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.333748 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.340225 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.353619 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.354979 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.356769 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-wchct" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.362533 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.369693 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.370858 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.378367 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-f6x8v" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.382637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.397336 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.424887 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.426143 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.433277 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.434350 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.436080 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bx2kp" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.436172 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.436348 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lxmj2" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.481702 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.504401 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.522673 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.522944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzpgq\" (UniqueName: \"kubernetes.io/projected/02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262-kube-api-access-jzpgq\") pod \"designate-operator-controller-manager-69977bdf55-4tl5z\" (UID: \"02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262\") " pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523044 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmfl4\" (UniqueName: \"kubernetes.io/projected/88f32422-f5bd-4fd8-85d1-ff2d6ccc1633-kube-api-access-nmfl4\") pod \"glance-operator-controller-manager-5847f67c56-vg9n8\" (UID: \"88f32422-f5bd-4fd8-85d1-ff2d6ccc1633\") " pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523064 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmlhb\" (UniqueName: \"kubernetes.io/projected/9f9e1543-36d5-427f-b1cc-3eb9baa9d826-kube-api-access-kmlhb\") pod \"cinder-operator-controller-manager-669b58f65-c97jv\" (UID: \"9f9e1543-36d5-427f-b1cc-3eb9baa9d826\") " pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523095 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91f28ab0-ea37-4fef-87f2-4150127c276e-cert\") pod 
\"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523116 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbbx\" (UniqueName: \"kubernetes.io/projected/91f28ab0-ea37-4fef-87f2-4150127c276e-kube-api-access-6xbbx\") pod \"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523148 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjndc\" (UniqueName: \"kubernetes.io/projected/e573b9b7-1b6c-40d6-93e0-c9103105034d-kube-api-access-cjndc\") pod \"horizon-operator-controller-manager-6985cf78fb-zpwbb\" (UID: \"e573b9b7-1b6c-40d6-93e0-c9103105034d\") " pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523179 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkwh\" (UniqueName: \"kubernetes.io/projected/488f9339-bc6f-419a-acdc-c4601f5f0d04-kube-api-access-qxkwh\") pod \"heat-operator-controller-manager-7b45cd6d68-dsxzg\" (UID: \"488f9339-bc6f-419a-acdc-c4601f5f0d04\") " pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.523219 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx47x\" (UniqueName: \"kubernetes.io/projected/070970b1-bb19-4aa4-b544-241064874029-kube-api-access-xx47x\") pod \"barbican-operator-controller-manager-bb565c8dd-qt4th\" (UID: 
\"070970b1-bb19-4aa4-b544-241064874029\") " pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.537518 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-2tqdx" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.544075 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.554642 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.564408 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.565622 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.570523 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6g4lv" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.587234 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.598834 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.603047 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.606815 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-j5rtv" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.609729 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.613241 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624217 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd6qj\" (UniqueName: \"kubernetes.io/projected/c52870d2-d447-44d4-b68c-420d695b65a0-kube-api-access-nd6qj\") pod \"ironic-operator-controller-manager-54fd9dc4b5-bqsrj\" (UID: \"c52870d2-d447-44d4-b68c-420d695b65a0\") " pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624273 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmfl4\" (UniqueName: \"kubernetes.io/projected/88f32422-f5bd-4fd8-85d1-ff2d6ccc1633-kube-api-access-nmfl4\") pod \"glance-operator-controller-manager-5847f67c56-vg9n8\" (UID: \"88f32422-f5bd-4fd8-85d1-ff2d6ccc1633\") " pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624295 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmlhb\" (UniqueName: \"kubernetes.io/projected/9f9e1543-36d5-427f-b1cc-3eb9baa9d826-kube-api-access-kmlhb\") pod \"cinder-operator-controller-manager-669b58f65-c97jv\" (UID: 
\"9f9e1543-36d5-427f-b1cc-3eb9baa9d826\") " pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91f28ab0-ea37-4fef-87f2-4150127c276e-cert\") pod \"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624343 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbbx\" (UniqueName: \"kubernetes.io/projected/91f28ab0-ea37-4fef-87f2-4150127c276e-kube-api-access-6xbbx\") pod \"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624367 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkc8k\" (UniqueName: \"kubernetes.io/projected/9d2c6c2d-8ff4-416a-9cc6-447d855fd954-kube-api-access-nkc8k\") pod \"keystone-operator-controller-manager-7f764db9b-kxkjz\" (UID: \"9d2c6c2d-8ff4-416a-9cc6-447d855fd954\") " pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjndc\" (UniqueName: \"kubernetes.io/projected/e573b9b7-1b6c-40d6-93e0-c9103105034d-kube-api-access-cjndc\") pod \"horizon-operator-controller-manager-6985cf78fb-zpwbb\" (UID: \"e573b9b7-1b6c-40d6-93e0-c9103105034d\") " pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 
14:11:24.624408 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkwh\" (UniqueName: \"kubernetes.io/projected/488f9339-bc6f-419a-acdc-c4601f5f0d04-kube-api-access-qxkwh\") pod \"heat-operator-controller-manager-7b45cd6d68-dsxzg\" (UID: \"488f9339-bc6f-419a-acdc-c4601f5f0d04\") " pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624443 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx47x\" (UniqueName: \"kubernetes.io/projected/070970b1-bb19-4aa4-b544-241064874029-kube-api-access-xx47x\") pod \"barbican-operator-controller-manager-bb565c8dd-qt4th\" (UID: \"070970b1-bb19-4aa4-b544-241064874029\") " pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.624475 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzpgq\" (UniqueName: \"kubernetes.io/projected/02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262-kube-api-access-jzpgq\") pod \"designate-operator-controller-manager-69977bdf55-4tl5z\" (UID: \"02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262\") " pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:24 crc kubenswrapper[4794]: E1215 14:11:24.625063 4794 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 15 14:11:24 crc kubenswrapper[4794]: E1215 14:11:24.625108 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f28ab0-ea37-4fef-87f2-4150127c276e-cert podName:91f28ab0-ea37-4fef-87f2-4150127c276e nodeName:}" failed. No retries permitted until 2025-12-15 14:11:25.125093073 +0000 UTC m=+1046.977115511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91f28ab0-ea37-4fef-87f2-4150127c276e-cert") pod "infra-operator-controller-manager-85d55b5858-r7v8b" (UID: "91f28ab0-ea37-4fef-87f2-4150127c276e") : secret "infra-operator-webhook-server-cert" not found Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.632525 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ghvhp" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.654430 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.660640 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmlhb\" (UniqueName: \"kubernetes.io/projected/9f9e1543-36d5-427f-b1cc-3eb9baa9d826-kube-api-access-kmlhb\") pod \"cinder-operator-controller-manager-669b58f65-c97jv\" (UID: \"9f9e1543-36d5-427f-b1cc-3eb9baa9d826\") " pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.660723 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmfl4\" (UniqueName: \"kubernetes.io/projected/88f32422-f5bd-4fd8-85d1-ff2d6ccc1633-kube-api-access-nmfl4\") pod \"glance-operator-controller-manager-5847f67c56-vg9n8\" (UID: \"88f32422-f5bd-4fd8-85d1-ff2d6ccc1633\") " pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.661317 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjndc\" (UniqueName: \"kubernetes.io/projected/e573b9b7-1b6c-40d6-93e0-c9103105034d-kube-api-access-cjndc\") pod \"horizon-operator-controller-manager-6985cf78fb-zpwbb\" (UID: \"e573b9b7-1b6c-40d6-93e0-c9103105034d\") " 
pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.661376 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbbx\" (UniqueName: \"kubernetes.io/projected/91f28ab0-ea37-4fef-87f2-4150127c276e-kube-api-access-6xbbx\") pod \"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.668843 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzpgq\" (UniqueName: \"kubernetes.io/projected/02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262-kube-api-access-jzpgq\") pod \"designate-operator-controller-manager-69977bdf55-4tl5z\" (UID: \"02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262\") " pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.672393 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.677925 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-l2dll"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.679391 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.680559 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.683248 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tc9fm" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.684026 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkwh\" (UniqueName: \"kubernetes.io/projected/488f9339-bc6f-419a-acdc-c4601f5f0d04-kube-api-access-qxkwh\") pod \"heat-operator-controller-manager-7b45cd6d68-dsxzg\" (UID: \"488f9339-bc6f-419a-acdc-c4601f5f0d04\") " pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.688472 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.689124 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx47x\" (UniqueName: \"kubernetes.io/projected/070970b1-bb19-4aa4-b544-241064874029-kube-api-access-xx47x\") pod \"barbican-operator-controller-manager-bb565c8dd-qt4th\" (UID: \"070970b1-bb19-4aa4-b544-241064874029\") " pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.689807 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.698075 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.700097 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.703382 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-xrxgj" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.706527 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.707553 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.710086 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rc4ph" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.711994 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.716618 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-l2dll"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.723024 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"] Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.725156 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2pxh\" (UniqueName: \"kubernetes.io/projected/d94816b6-2a4c-44fa-a7c0-811c18ec190d-kube-api-access-z2pxh\") pod \"manila-operator-controller-manager-7cc599445b-p7sck\" (UID: \"d94816b6-2a4c-44fa-a7c0-811c18ec190d\") " pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" Dec 15 
14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.725222 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd6qj\" (UniqueName: \"kubernetes.io/projected/c52870d2-d447-44d4-b68c-420d695b65a0-kube-api-access-nd6qj\") pod \"ironic-operator-controller-manager-54fd9dc4b5-bqsrj\" (UID: \"c52870d2-d447-44d4-b68c-420d695b65a0\") " pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.725274 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l2zg\" (UniqueName: \"kubernetes.io/projected/c7e6e262-54be-48b7-8e26-098358cab436-kube-api-access-8l2zg\") pod \"mariadb-operator-controller-manager-64d7c556cd-nq78b\" (UID: \"c7e6e262-54be-48b7-8e26-098358cab436\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.725337 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgjqf\" (UniqueName: \"kubernetes.io/projected/d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd-kube-api-access-fgjqf\") pod \"neutron-operator-controller-manager-58879495c-l2dll\" (UID: \"d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.725365 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkc8k\" (UniqueName: \"kubernetes.io/projected/9d2c6c2d-8ff4-416a-9cc6-447d855fd954-kube-api-access-nkc8k\") pod \"keystone-operator-controller-manager-7f764db9b-kxkjz\" (UID: \"9d2c6c2d-8ff4-416a-9cc6-447d855fd954\") " pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.732511 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.733556 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.737277 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-s25th"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.743725 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkc8k\" (UniqueName: \"kubernetes.io/projected/9d2c6c2d-8ff4-416a-9cc6-447d855fd954-kube-api-access-nkc8k\") pod \"keystone-operator-controller-manager-7f764db9b-kxkjz\" (UID: \"9d2c6c2d-8ff4-416a-9cc6-447d855fd954\") " pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.746395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd6qj\" (UniqueName: \"kubernetes.io/projected/c52870d2-d447-44d4-b68c-420d695b65a0-kube-api-access-nd6qj\") pod \"ironic-operator-controller-manager-54fd9dc4b5-bqsrj\" (UID: \"c52870d2-d447-44d4-b68c-420d695b65a0\") " pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.748533 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.749548 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.751039 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.752233 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2fjk8"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.756241 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.765251 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.766305 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.771243 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-lm2v4"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.779415 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.785218 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.809244 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.811382 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.814435 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5fghn"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.825604 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.826681 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jcg\" (UniqueName: \"kubernetes.io/projected/20583712-205a-4875-aef3-0052b1dc4382-kube-api-access-d6jcg\") pod \"placement-operator-controller-manager-cc776f956-tmnhc\" (UID: \"20583712-205a-4875-aef3-0052b1dc4382\") " pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.826744 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjdw\" (UniqueName: \"kubernetes.io/projected/16115cae-adf6-4065-90c9-082ab050dc96-kube-api-access-zmjdw\") pod \"ovn-operator-controller-manager-5b67cfc8fb-6bmz8\" (UID: \"16115cae-adf6-4065-90c9-082ab050dc96\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.826786 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lcj\" (UniqueName: \"kubernetes.io/projected/1a552228-3637-48fa-b860-64f1d63d9726-kube-api-access-r9lcj\") pod \"octavia-operator-controller-manager-d5fb87cb8-269nm\" (UID: \"1a552228-3637-48fa-b860-64f1d63d9726\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.826854 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2pxh\" (UniqueName: \"kubernetes.io/projected/d94816b6-2a4c-44fa-a7c0-811c18ec190d-kube-api-access-z2pxh\") pod \"manila-operator-controller-manager-7cc599445b-p7sck\" (UID: \"d94816b6-2a4c-44fa-a7c0-811c18ec190d\") " pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.826928 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmvw\" (UniqueName: \"kubernetes.io/projected/84d790b6-ef4b-449b-80eb-0bc812ed496f-kube-api-access-4xmvw\") pod \"nova-operator-controller-manager-6b444986fd-fjddz\" (UID: \"84d790b6-ef4b-449b-80eb-0bc812ed496f\") " pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.826947 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dca5b13-a635-4513-988f-48091076cff9-cert\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.827002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l2zg\" (UniqueName: \"kubernetes.io/projected/c7e6e262-54be-48b7-8e26-098358cab436-kube-api-access-8l2zg\") pod \"mariadb-operator-controller-manager-64d7c556cd-nq78b\" (UID: \"c7e6e262-54be-48b7-8e26-098358cab436\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.827072 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgjqf\" (UniqueName: \"kubernetes.io/projected/d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd-kube-api-access-fgjqf\") pod \"neutron-operator-controller-manager-58879495c-l2dll\" (UID: \"d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.827094 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrhr\" (UniqueName: \"kubernetes.io/projected/0dca5b13-a635-4513-988f-48091076cff9-kube-api-access-wqrhr\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.836350 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.843842 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.851888 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.853270 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.856147 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgjqf\" (UniqueName: \"kubernetes.io/projected/d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd-kube-api-access-fgjqf\") pod \"neutron-operator-controller-manager-58879495c-l2dll\" (UID: \"d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.856366 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-htmbj"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.856802 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2pxh\" (UniqueName: \"kubernetes.io/projected/d94816b6-2a4c-44fa-a7c0-811c18ec190d-kube-api-access-z2pxh\") pod \"manila-operator-controller-manager-7cc599445b-p7sck\" (UID: \"d94816b6-2a4c-44fa-a7c0-811c18ec190d\") " pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.881027 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l2zg\" (UniqueName: \"kubernetes.io/projected/c7e6e262-54be-48b7-8e26-098358cab436-kube-api-access-8l2zg\") pod \"mariadb-operator-controller-manager-64d7c556cd-nq78b\" (UID: \"c7e6e262-54be-48b7-8e26-098358cab436\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.916806 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"]
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.924103 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.928957 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930655 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrhr\" (UniqueName: \"kubernetes.io/projected/0dca5b13-a635-4513-988f-48091076cff9-kube-api-access-wqrhr\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930716 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjnwv\" (UniqueName: \"kubernetes.io/projected/b6e33671-c04d-4fc0-825d-13355e317733-kube-api-access-vjnwv\") pod \"swift-operator-controller-manager-7c9ff8845d-tgfnx\" (UID: \"b6e33671-c04d-4fc0-825d-13355e317733\") " pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930755 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jcg\" (UniqueName: \"kubernetes.io/projected/20583712-205a-4875-aef3-0052b1dc4382-kube-api-access-d6jcg\") pod \"placement-operator-controller-manager-cc776f956-tmnhc\" (UID: \"20583712-205a-4875-aef3-0052b1dc4382\") " pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930792 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjdw\" (UniqueName: \"kubernetes.io/projected/16115cae-adf6-4065-90c9-082ab050dc96-kube-api-access-zmjdw\") pod \"ovn-operator-controller-manager-5b67cfc8fb-6bmz8\" (UID: \"16115cae-adf6-4065-90c9-082ab050dc96\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930839 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lcj\" (UniqueName: \"kubernetes.io/projected/1a552228-3637-48fa-b860-64f1d63d9726-kube-api-access-r9lcj\") pod \"octavia-operator-controller-manager-d5fb87cb8-269nm\" (UID: \"1a552228-3637-48fa-b860-64f1d63d9726\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930899 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srvr7\" (UniqueName: \"kubernetes.io/projected/5e547ae0-d6e7-4dd7-b6c1-731554f36f8d-kube-api-access-srvr7\") pod \"telemetry-operator-controller-manager-6bc5b9c47-drrxw\" (UID: \"5e547ae0-d6e7-4dd7-b6c1-731554f36f8d\") " pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmvw\" (UniqueName: \"kubernetes.io/projected/84d790b6-ef4b-449b-80eb-0bc812ed496f-kube-api-access-4xmvw\") pod \"nova-operator-controller-manager-6b444986fd-fjddz\" (UID: \"84d790b6-ef4b-449b-80eb-0bc812ed496f\") " pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.930983 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dca5b13-a635-4513-988f-48091076cff9-cert\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:24 crc kubenswrapper[4794]: E1215 14:11:24.931201 4794 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 15 14:11:24 crc kubenswrapper[4794]: E1215 14:11:24.931270 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dca5b13-a635-4513-988f-48091076cff9-cert podName:0dca5b13-a635-4513-988f-48091076cff9 nodeName:}" failed. No retries permitted until 2025-12-15 14:11:25.431251241 +0000 UTC m=+1047.283273679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dca5b13-a635-4513-988f-48091076cff9-cert") pod "openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" (UID: "0dca5b13-a635-4513-988f-48091076cff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.941738 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.941869 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.963532 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.963681 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrhr\" (UniqueName: \"kubernetes.io/projected/0dca5b13-a635-4513-988f-48091076cff9-kube-api-access-wqrhr\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.963824 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lcj\" (UniqueName: \"kubernetes.io/projected/1a552228-3637-48fa-b860-64f1d63d9726-kube-api-access-r9lcj\") pod \"octavia-operator-controller-manager-d5fb87cb8-269nm\" (UID: \"1a552228-3637-48fa-b860-64f1d63d9726\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.964575 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjdw\" (UniqueName: \"kubernetes.io/projected/16115cae-adf6-4065-90c9-082ab050dc96-kube-api-access-zmjdw\") pod \"ovn-operator-controller-manager-5b67cfc8fb-6bmz8\" (UID: \"16115cae-adf6-4065-90c9-082ab050dc96\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.967348 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmvw\" (UniqueName: \"kubernetes.io/projected/84d790b6-ef4b-449b-80eb-0bc812ed496f-kube-api-access-4xmvw\") pod \"nova-operator-controller-manager-6b444986fd-fjddz\" (UID: \"84d790b6-ef4b-449b-80eb-0bc812ed496f\") " pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"
Dec 15 14:11:24 crc kubenswrapper[4794]: I1215 14:11:24.968489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jcg\" (UniqueName: \"kubernetes.io/projected/20583712-205a-4875-aef3-0052b1dc4382-kube-api-access-d6jcg\") pod \"placement-operator-controller-manager-cc776f956-tmnhc\" (UID: \"20583712-205a-4875-aef3-0052b1dc4382\") " pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.040300 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srvr7\" (UniqueName: \"kubernetes.io/projected/5e547ae0-d6e7-4dd7-b6c1-731554f36f8d-kube-api-access-srvr7\") pod \"telemetry-operator-controller-manager-6bc5b9c47-drrxw\" (UID: \"5e547ae0-d6e7-4dd7-b6c1-731554f36f8d\") " pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.040393 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjnwv\" (UniqueName: \"kubernetes.io/projected/b6e33671-c04d-4fc0-825d-13355e317733-kube-api-access-vjnwv\") pod \"swift-operator-controller-manager-7c9ff8845d-tgfnx\" (UID: \"b6e33671-c04d-4fc0-825d-13355e317733\") " pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.058659 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.059897 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.064373 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8nsnc"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.068404 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjnwv\" (UniqueName: \"kubernetes.io/projected/b6e33671-c04d-4fc0-825d-13355e317733-kube-api-access-vjnwv\") pod \"swift-operator-controller-manager-7c9ff8845d-tgfnx\" (UID: \"b6e33671-c04d-4fc0-825d-13355e317733\") " pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.078685 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.084306 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srvr7\" (UniqueName: \"kubernetes.io/projected/5e547ae0-d6e7-4dd7-b6c1-731554f36f8d-kube-api-access-srvr7\") pod \"telemetry-operator-controller-manager-6bc5b9c47-drrxw\" (UID: \"5e547ae0-d6e7-4dd7-b6c1-731554f36f8d\") " pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.091942 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.101261 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.104594 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.106154 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.107935 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-25hzt"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.116453 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.120677 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.131111 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.137773 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.138956 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.141690 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.143334 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91f28ab0-ea37-4fef-87f2-4150127c276e-cert\") pod \"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.143390 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvf56\" (UniqueName: \"kubernetes.io/projected/9bd54690-941b-45f7-b42d-a9b5f8ebc065-kube-api-access-lvf56\") pod \"watcher-operator-controller-manager-7687976674-p8vpp\" (UID: \"9bd54690-941b-45f7-b42d-a9b5f8ebc065\") " pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.143429 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcnz\" (UniqueName: \"kubernetes.io/projected/d36dacd3-9670-4927-a93f-cbc50b901ef5-kube-api-access-2lcnz\") pod \"test-operator-controller-manager-5d79c6465c-mp2hm\" (UID: \"d36dacd3-9670-4927-a93f-cbc50b901ef5\") " pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.141305 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-db69g"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.147351 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91f28ab0-ea37-4fef-87f2-4150127c276e-cert\") pod \"infra-operator-controller-manager-85d55b5858-r7v8b\" (UID: \"91f28ab0-ea37-4fef-87f2-4150127c276e\") " pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.148612 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.185106 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.186378 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.189561 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gmqzh"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.205152 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.206713 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.245541 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gw6x\" (UniqueName: \"kubernetes.io/projected/5c962533-9c6b-459d-92b7-768ca6a8b110-kube-api-access-4gw6x\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx\" (UID: \"5c962533-9c6b-459d-92b7-768ca6a8b110\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.245851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvf56\" (UniqueName: \"kubernetes.io/projected/9bd54690-941b-45f7-b42d-a9b5f8ebc065-kube-api-access-lvf56\") pod \"watcher-operator-controller-manager-7687976674-p8vpp\" (UID: \"9bd54690-941b-45f7-b42d-a9b5f8ebc065\") " pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.245886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcnz\" (UniqueName: \"kubernetes.io/projected/d36dacd3-9670-4927-a93f-cbc50b901ef5-kube-api-access-2lcnz\") pod \"test-operator-controller-manager-5d79c6465c-mp2hm\" (UID: \"d36dacd3-9670-4927-a93f-cbc50b901ef5\") " pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.245945 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c097f1d-1a33-480b-945a-5aa6c4e605c3-cert\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.245962 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6bd\" (UniqueName: \"kubernetes.io/projected/5c097f1d-1a33-480b-945a-5aa6c4e605c3-kube-api-access-tk6bd\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.249064 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.263445 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.269112 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcnz\" (UniqueName: \"kubernetes.io/projected/d36dacd3-9670-4927-a93f-cbc50b901ef5-kube-api-access-2lcnz\") pod \"test-operator-controller-manager-5d79c6465c-mp2hm\" (UID: \"d36dacd3-9670-4927-a93f-cbc50b901ef5\") " pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.284601 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvf56\" (UniqueName: \"kubernetes.io/projected/9bd54690-941b-45f7-b42d-a9b5f8ebc065-kube-api-access-lvf56\") pod \"watcher-operator-controller-manager-7687976674-p8vpp\" (UID: \"9bd54690-941b-45f7-b42d-a9b5f8ebc065\") " pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.291282 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.294189 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.351890 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c097f1d-1a33-480b-945a-5aa6c4e605c3-cert\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.351929 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6bd\" (UniqueName: \"kubernetes.io/projected/5c097f1d-1a33-480b-945a-5aa6c4e605c3-kube-api-access-tk6bd\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.352032 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gw6x\" (UniqueName: \"kubernetes.io/projected/5c962533-9c6b-459d-92b7-768ca6a8b110-kube-api-access-4gw6x\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx\" (UID: \"5c962533-9c6b-459d-92b7-768ca6a8b110\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"
Dec 15 14:11:25 crc kubenswrapper[4794]: E1215 14:11:25.353040 4794 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 15 14:11:25 crc kubenswrapper[4794]: E1215 14:11:25.353123 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c097f1d-1a33-480b-945a-5aa6c4e605c3-cert podName:5c097f1d-1a33-480b-945a-5aa6c4e605c3 nodeName:}" failed. No retries permitted until 2025-12-15 14:11:25.853100417 +0000 UTC m=+1047.705122855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c097f1d-1a33-480b-945a-5aa6c4e605c3-cert") pod "openstack-operator-controller-manager-cfcd4798-sk7w5" (UID: "5c097f1d-1a33-480b-945a-5aa6c4e605c3") : secret "webhook-server-cert" not found
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.365979 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.377788 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6bd\" (UniqueName: \"kubernetes.io/projected/5c097f1d-1a33-480b-945a-5aa6c4e605c3-kube-api-access-tk6bd\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.389039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gw6x\" (UniqueName: \"kubernetes.io/projected/5c962533-9c6b-459d-92b7-768ca6a8b110-kube-api-access-4gw6x\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx\" (UID: \"5c962533-9c6b-459d-92b7-768ca6a8b110\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.430899 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.440298 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.452993 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dca5b13-a635-4513-988f-48091076cff9-cert\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.457744 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dca5b13-a635-4513-988f-48091076cff9-cert\") pod \"openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh\" (UID: \"0dca5b13-a635-4513-988f-48091076cff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.458285 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.475800 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.512060 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.677002 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th"]
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.682753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj"]
Dec 15 14:11:25 crc kubenswrapper[4794]: W1215 14:11:25.692809 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod070970b1_bb19_4aa4_b544_241064874029.slice/crio-d08e1432c7a70312ae580123cb9ecc842d13e60d44ad6aa3b12ab85786e61421 WatchSource:0}: Error finding container d08e1432c7a70312ae580123cb9ecc842d13e60d44ad6aa3b12ab85786e61421: Status 404 returned error can't find the container with id d08e1432c7a70312ae580123cb9ecc842d13e60d44ad6aa3b12ab85786e61421
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.756283 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.858636 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c097f1d-1a33-480b-945a-5aa6c4e605c3-cert\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.867242 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c097f1d-1a33-480b-945a-5aa6c4e605c3-cert\") pod \"openstack-operator-controller-manager-cfcd4798-sk7w5\" (UID: \"5c097f1d-1a33-480b-945a-5aa6c4e605c3\") " pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.872981 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" event={"ID":"88f32422-f5bd-4fd8-85d1-ff2d6ccc1633","Type":"ContainerStarted","Data":"1578a7f1811f5de0be97056f099e26f9b2950ef99e63aa24e22b2392af7248b2"}
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.874171 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" event={"ID":"02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262","Type":"ContainerStarted","Data":"ce706b212950979caa72a97ac7db7c9cd244f7abde348d9b6fd9bdc1ef8fab03"}
Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.874770 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" 
event={"ID":"c52870d2-d447-44d4-b68c-420d695b65a0","Type":"ContainerStarted","Data":"1b5cfc9a1a8c07cb198690cc68228e58d9543a8e7428e2fd287c7101c2eac195"} Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.875320 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" event={"ID":"488f9339-bc6f-419a-acdc-c4601f5f0d04","Type":"ContainerStarted","Data":"4e13d3f573c83790eeaee6adfd4e692fdb1871dd68a6e21da8153fe144fd6e93"} Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.877936 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" event={"ID":"e573b9b7-1b6c-40d6-93e0-c9103105034d","Type":"ContainerStarted","Data":"333b5b792772f85513379dd1c9685463e607d63095e15a9a1e1004e3fb0be431"} Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.891435 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" event={"ID":"070970b1-bb19-4aa4-b544-241064874029","Type":"ContainerStarted","Data":"d08e1432c7a70312ae580123cb9ecc842d13e60d44ad6aa3b12ab85786e61421"} Dec 15 14:11:25 crc kubenswrapper[4794]: W1215 14:11:25.965846 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d2c6c2d_8ff4_416a_9cc6_447d855fd954.slice/crio-7dc3236fbb82a097f14e6049993eba3410162ddd42960422bfbca00111019591 WatchSource:0}: Error finding container 7dc3236fbb82a097f14e6049993eba3410162ddd42960422bfbca00111019591: Status 404 returned error can't find the container with id 7dc3236fbb82a097f14e6049993eba3410162ddd42960422bfbca00111019591 Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.971637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz"] Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.973221 4794 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz"] Dec 15 14:11:25 crc kubenswrapper[4794]: I1215 14:11:25.985623 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.071119 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.093029 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.129891 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.150787 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-l2dll"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.161671 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.167983 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.170048 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.173964 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc"] Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.185160 4794 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8l2zg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-64d7c556cd-nq78b_openstack-operators(c7e6e262-54be-48b7-8e26-098358cab436): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.185215 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d6a3d956e8dada1d7da372b532f955e6310002527667e24b08220c65956110af,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kmlhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-669b58f65-c97jv_openstack-operators(9f9e1543-36d5-427f-b1cc-3eb9baa9d826): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.226244 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.250158 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw"] Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.250354 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4gw6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx_openstack-operators(5c962533-9c6b-459d-92b7-768ca6a8b110): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.255532 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx" podUID="5c962533-9c6b-459d-92b7-768ca6a8b110" Dec 15 14:11:26 crc kubenswrapper[4794]: W1215 14:11:26.272833 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e547ae0_d6e7_4dd7_b6c1_731554f36f8d.slice/crio-bb7b801bc30a08870f072b82f9b2b58a2003fac173ba4fe243418a900fc84075 WatchSource:0}: Error finding container bb7b801bc30a08870f072b82f9b2b58a2003fac173ba4fe243418a900fc84075: Status 404 returned error can't find the container with id bb7b801bc30a08870f072b82f9b2b58a2003fac173ba4fe243418a900fc84075 Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.277449 
4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:13002ade744a50b84f3e9e793e68a3998be0d90fe877520fbd60257309931d7d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-srvr7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6bc5b9c47-drrxw_openstack-operators(5e547ae0-d6e7-4dd7-b6c1-731554f36f8d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.346413 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" podUID="c7e6e262-54be-48b7-8e26-098358cab436" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.365735 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" podUID="9f9e1543-36d5-427f-b1cc-3eb9baa9d826" Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.400559 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm"] Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.409882 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b"] Dec 15 14:11:26 crc kubenswrapper[4794]: W1215 14:11:26.412402 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f28ab0_ea37_4fef_87f2_4150127c276e.slice/crio-3d18f03ee413115d503726ae4dbc054e32526c6eae36b2ddab00d54b16d91ed5 WatchSource:0}: Error finding container 3d18f03ee413115d503726ae4dbc054e32526c6eae36b2ddab00d54b16d91ed5: Status 404 returned error can't find the container with id 3d18f03ee413115d503726ae4dbc054e32526c6eae36b2ddab00d54b16d91ed5 Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.421383 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"] Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.429840 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.145:5001/openstack-k8s-operators/watcher-operator:3c1649d7f41bde875c3f37083dd1dffeb8d37484,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lvf56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7687976674-p8vpp_openstack-operators(9bd54690-941b-45f7-b42d-a9b5f8ebc065): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.442467 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:b9e09dbcf7f70960e90ecbb8b31bbb7acf141fc4975f69e37482df2bd0ea2773,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2lcnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5d79c6465c-mp2hm_openstack-operators(d36dacd3-9670-4927-a93f-cbc50b901ef5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.450915 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" podUID="5e547ae0-d6e7-4dd7-b6c1-731554f36f8d" Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.460411 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh"] Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.485478 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bfb1e0635f87094bee949f00fea37cbc27b88c42a7cef1909e0b68e5abd185c7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,Value
From:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/op
enstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Va
lue:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podif
ied-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.i
o/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT
,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wqrhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh_openstack-operators(0dca5b13-a635-4513-988f-48091076cff9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.583287 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5"] Dec 15 14:11:26 crc kubenswrapper[4794]: W1215 14:11:26.615775 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c097f1d_1a33_480b_945a_5aa6c4e605c3.slice/crio-8594bfbc11a0f32c57a4822f40c248f7d3bce7665039d22c1bbdb4cd04896ff8 WatchSource:0}: Error finding container 8594bfbc11a0f32c57a4822f40c248f7d3bce7665039d22c1bbdb4cd04896ff8: Status 404 returned error can't find the container with id 
8594bfbc11a0f32c57a4822f40c248f7d3bce7665039d22c1bbdb4cd04896ff8 Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.627017 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" podUID="d36dacd3-9670-4927-a93f-cbc50b901ef5" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.689439 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" podUID="0dca5b13-a635-4513-988f-48091076cff9" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.710571 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.956983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx" event={"ID":"5c962533-9c6b-459d-92b7-768ca6a8b110","Type":"ContainerStarted","Data":"46c8787a8a6cb6832762b1d66da92e0ddc963eac62e98840f40d3a937ea9970e"} Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.962988 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" event={"ID":"5e547ae0-d6e7-4dd7-b6c1-731554f36f8d","Type":"ContainerStarted","Data":"d46bf652a65df0832d7a622b56404602ccd63cdcd44836d3955788cae17e05d8"} Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.963033 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" 
event={"ID":"5e547ae0-d6e7-4dd7-b6c1-731554f36f8d","Type":"ContainerStarted","Data":"bb7b801bc30a08870f072b82f9b2b58a2003fac173ba4fe243418a900fc84075"} Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.987305 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" event={"ID":"c7e6e262-54be-48b7-8e26-098358cab436","Type":"ContainerStarted","Data":"1b33bc7f00c3b3222eb6bb96eed761113232b62157f23967db9560fed3f393c5"} Dec 15 14:11:26 crc kubenswrapper[4794]: I1215 14:11:26.987349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" event={"ID":"c7e6e262-54be-48b7-8e26-098358cab436","Type":"ContainerStarted","Data":"212c689658b07d4c27713f7a91a764e7155be7fb870b14764be7f538d0207663"} Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.988491 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx" podUID="5c962533-9c6b-459d-92b7-768ca6a8b110" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.988570 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:13002ade744a50b84f3e9e793e68a3998be0d90fe877520fbd60257309931d7d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" podUID="5e547ae0-d6e7-4dd7-b6c1-731554f36f8d" Dec 15 14:11:26 crc kubenswrapper[4794]: E1215 14:11:26.989564 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" podUID="c7e6e262-54be-48b7-8e26-098358cab436" Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.004377 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" event={"ID":"d36dacd3-9670-4927-a93f-cbc50b901ef5","Type":"ContainerStarted","Data":"4805b7b25a901bcbf277f6b6b14c49a14cbc9380b395e1f2fdf1f1cb869d5a3f"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.004445 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" event={"ID":"d36dacd3-9670-4927-a93f-cbc50b901ef5","Type":"ContainerStarted","Data":"df949923e038a6824bf75cbe7392d89603b179e427454a267318acc9f7ecdaaa"} Dec 15 14:11:27 crc kubenswrapper[4794]: E1215 14:11:27.008174 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:b9e09dbcf7f70960e90ecbb8b31bbb7acf141fc4975f69e37482df2bd0ea2773\\\"\"" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" podUID="d36dacd3-9670-4927-a93f-cbc50b901ef5" Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.019166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" event={"ID":"9d2c6c2d-8ff4-416a-9cc6-447d855fd954","Type":"ContainerStarted","Data":"7dc3236fbb82a097f14e6049993eba3410162ddd42960422bfbca00111019591"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.058752 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" event={"ID":"5c097f1d-1a33-480b-945a-5aa6c4e605c3","Type":"ContainerStarted","Data":"af569411f0b273e81747b9ea927a034b6eb169aaef79e824c670d6ecc31e0932"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.058790 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" event={"ID":"5c097f1d-1a33-480b-945a-5aa6c4e605c3","Type":"ContainerStarted","Data":"8594bfbc11a0f32c57a4822f40c248f7d3bce7665039d22c1bbdb4cd04896ff8"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.081428 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" event={"ID":"84d790b6-ef4b-449b-80eb-0bc812ed496f","Type":"ContainerStarted","Data":"ecdf042070d34e018f206a3f713d54b86c24a0725b45c2515ad590caf6da71a9"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.082737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" event={"ID":"d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd","Type":"ContainerStarted","Data":"c9fe313083c4964185268d496dbc5fcf3ab4648fb44d74c54300d527e9f36d60"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.109088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" event={"ID":"91f28ab0-ea37-4fef-87f2-4150127c276e","Type":"ContainerStarted","Data":"3d18f03ee413115d503726ae4dbc054e32526c6eae36b2ddab00d54b16d91ed5"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.110929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" event={"ID":"0dca5b13-a635-4513-988f-48091076cff9","Type":"ContainerStarted","Data":"4520132e45057ff2a14d0bc3c81743b6b6a9dfa61ef91f679764e659a66ed48f"} Dec 15 14:11:27 
crc kubenswrapper[4794]: I1215 14:11:27.110971 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" event={"ID":"0dca5b13-a635-4513-988f-48091076cff9","Type":"ContainerStarted","Data":"20f241f1df010e5f99cf90f4e3e9d1485610237ac3edb6ac9241fff2a354f99d"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.112994 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8" event={"ID":"16115cae-adf6-4065-90c9-082ab050dc96","Type":"ContainerStarted","Data":"ec730105dc614b4b04a84228fa8a2bb04bafa097341cdc0c5c5f1d44a18c207e"} Dec 15 14:11:27 crc kubenswrapper[4794]: E1215 14:11:27.113002 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bfb1e0635f87094bee949f00fea37cbc27b88c42a7cef1909e0b68e5abd185c7\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" podUID="0dca5b13-a635-4513-988f-48091076cff9" Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.115150 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx" event={"ID":"b6e33671-c04d-4fc0-825d-13355e317733","Type":"ContainerStarted","Data":"0d3ec837fede906275897850c41e65bfd87224e6e030a642235416cf1f13591e"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.118144 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" event={"ID":"9bd54690-941b-45f7-b42d-a9b5f8ebc065","Type":"ContainerStarted","Data":"b9eb37e2979547d7db8c0ff0e323f330da1893b513910b75ce6c4e42f532605b"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.118169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" event={"ID":"9bd54690-941b-45f7-b42d-a9b5f8ebc065","Type":"ContainerStarted","Data":"838cb71dfea876f11b30bb2834d45ba66a2fe7081d67d72513e715641c38a8a4"} Dec 15 14:11:27 crc kubenswrapper[4794]: E1215 14:11:27.119777 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.145:5001/openstack-k8s-operators/watcher-operator:3c1649d7f41bde875c3f37083dd1dffeb8d37484\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.119789 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" event={"ID":"1a552228-3637-48fa-b860-64f1d63d9726","Type":"ContainerStarted","Data":"4516615dbdc9b4c78fbcb041ba6f6b35abcc2b8d3cb7197fadfc3033df7021e0"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.121320 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc" event={"ID":"20583712-205a-4875-aef3-0052b1dc4382","Type":"ContainerStarted","Data":"b393a18bd6b754b619aab882b65f4b43a97c7ff5af3e2d671ee4e5ded705d250"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.149407 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" event={"ID":"d94816b6-2a4c-44fa-a7c0-811c18ec190d","Type":"ContainerStarted","Data":"13f05d6f555a097823e8e3b08ad8d975300828db08717bc016410359f5bc8251"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.157940 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" 
event={"ID":"9f9e1543-36d5-427f-b1cc-3eb9baa9d826","Type":"ContainerStarted","Data":"50faa19651bd2759e3c97ad335efb10618bd17dc502895d841ef9354584910f2"} Dec 15 14:11:27 crc kubenswrapper[4794]: I1215 14:11:27.157984 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" event={"ID":"9f9e1543-36d5-427f-b1cc-3eb9baa9d826","Type":"ContainerStarted","Data":"0938e39f889066097d45e5085902ccf80083c907920854e9d96c72309eaf0c62"} Dec 15 14:11:27 crc kubenswrapper[4794]: E1215 14:11:27.162134 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d6a3d956e8dada1d7da372b532f955e6310002527667e24b08220c65956110af\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" podUID="9f9e1543-36d5-427f-b1cc-3eb9baa9d826" Dec 15 14:11:28 crc kubenswrapper[4794]: I1215 14:11:28.180061 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" event={"ID":"5c097f1d-1a33-480b-945a-5aa6c4e605c3","Type":"ContainerStarted","Data":"93258f994b05f3e04b6556f42919851008961b2a935577f9598985d93abe2bd5"} Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182066 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx" podUID="5c962533-9c6b-459d-92b7-768ca6a8b110" Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182407 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:b9e09dbcf7f70960e90ecbb8b31bbb7acf141fc4975f69e37482df2bd0ea2773\\\"\"" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" podUID="d36dacd3-9670-4927-a93f-cbc50b901ef5" Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182424 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bfb1e0635f87094bee949f00fea37cbc27b88c42a7cef1909e0b68e5abd185c7\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" podUID="0dca5b13-a635-4513-988f-48091076cff9" Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182422 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d6a3d956e8dada1d7da372b532f955e6310002527667e24b08220c65956110af\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" podUID="9f9e1543-36d5-427f-b1cc-3eb9baa9d826" Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182562 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.145:5001/openstack-k8s-operators/watcher-operator:3c1649d7f41bde875c3f37083dd1dffeb8d37484\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182837 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:13002ade744a50b84f3e9e793e68a3998be0d90fe877520fbd60257309931d7d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" podUID="5e547ae0-d6e7-4dd7-b6c1-731554f36f8d" Dec 15 14:11:28 crc kubenswrapper[4794]: E1215 14:11:28.182933 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" podUID="c7e6e262-54be-48b7-8e26-098358cab436" Dec 15 14:11:28 crc kubenswrapper[4794]: I1215 14:11:28.302061 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" podStartSLOduration=4.302043591 podStartE2EDuration="4.302043591s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:11:28.297163844 +0000 UTC m=+1050.149186292" watchObservedRunningTime="2025-12-15 14:11:28.302043591 +0000 UTC m=+1050.154066019" Dec 15 14:11:29 crc kubenswrapper[4794]: I1215 14:11:29.187383 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" Dec 15 14:11:36 crc kubenswrapper[4794]: I1215 14:11:36.091309 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-cfcd4798-sk7w5" Dec 15 14:11:39 crc kubenswrapper[4794]: E1215 14:11:39.584028 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/barbican-operator@sha256:03833d5c6982b42c836787ee1863f5f73e20dce26a154171de6c0cf4712938b5" Dec 15 14:11:39 crc kubenswrapper[4794]: E1215 14:11:39.584191 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:03833d5c6982b42c836787ee1863f5f73e20dce26a154171de6c0cf4712938b5,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xx47x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-bb565c8dd-qt4th_openstack-operators(070970b1-bb19-4aa4-b544-241064874029): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 15 14:11:40 crc kubenswrapper[4794]: E1215 14:11:40.066143 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e2b7b9bdbf93b2ff7012cd2af921ae43082fd3eb036d884f13292c2e56f505c" Dec 15 14:11:40 crc kubenswrapper[4794]: E1215 14:11:40.066981 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e2b7b9bdbf93b2ff7012cd2af921ae43082fd3eb036d884f13292c2e56f505c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nd6qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ironic-operator-controller-manager-54fd9dc4b5-bqsrj_openstack-operators(c52870d2-d447-44d4-b68c-420d695b65a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 15 14:11:40 crc kubenswrapper[4794]: E1215 14:11:40.544687 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:177bba84f71a0b2cfd00a31147aa349fe4c25c83d2b9df7563b5dd5cfeafc161" Dec 15 14:11:40 crc kubenswrapper[4794]: E1215 14:11:40.544984 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:177bba84f71a0b2cfd00a31147aa349fe4c25c83d2b9df7563b5dd5cfeafc161,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xmvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-6b444986fd-fjddz_openstack-operators(84d790b6-ef4b-449b-80eb-0bc812ed496f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 15 14:11:40 crc kubenswrapper[4794]: I1215 14:11:40.752733 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 14:11:40 crc kubenswrapper[4794]: E1215 14:11:40.995298 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" podUID="c52870d2-d447-44d4-b68c-420d695b65a0" Dec 15 14:11:41 crc kubenswrapper[4794]: E1215 14:11:41.082557 4794 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" podUID="84d790b6-ef4b-449b-80eb-0bc812ed496f" Dec 15 14:11:41 crc kubenswrapper[4794]: E1215 14:11:41.093247 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" podUID="070970b1-bb19-4aa4-b544-241064874029" Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.305339 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8" event={"ID":"16115cae-adf6-4065-90c9-082ab050dc96","Type":"ContainerStarted","Data":"49d2aa163ac83bf95fcdb912789ddda70e6dd3e7afb01cf2de7936ba5c015fa6"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.314859 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" event={"ID":"9d2c6c2d-8ff4-416a-9cc6-447d855fd954","Type":"ContainerStarted","Data":"13c90c19e91c2864f980ff3320edcbadc5da156328c2fbd614a9a221dff706b1"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.342483 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" event={"ID":"1a552228-3637-48fa-b860-64f1d63d9726","Type":"ContainerStarted","Data":"9714195ed893a83fbf69e12ad351c8db85809cdf1ee23c9bb27b678879f8a1da"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.343825 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc" 
event={"ID":"20583712-205a-4875-aef3-0052b1dc4382","Type":"ContainerStarted","Data":"699b0cc8d3fa55a18ac53e50129d5baca31a6ed2e00ca70a9f13927d1e898d71"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.355762 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" event={"ID":"488f9339-bc6f-419a-acdc-c4601f5f0d04","Type":"ContainerStarted","Data":"195f8d84045ff51a130dc6182da2a819724d7dabfe1d0dff8be39eb005f98d02"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.377992 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" event={"ID":"d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd","Type":"ContainerStarted","Data":"2858a003c378b313734d1289e1479dce6b1cf240603f97f4bee4061bf4a03e9b"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.379898 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" event={"ID":"02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262","Type":"ContainerStarted","Data":"205fda24e4f8b04457ad07bc75a540abffecf5b7b08dee0e31d8b99c4c88168e"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.423029 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx" event={"ID":"b6e33671-c04d-4fc0-825d-13355e317733","Type":"ContainerStarted","Data":"4a7f35db6febadc221126bdb51d3a56d3820a7080bda32fbe3fb4937ad04c5c7"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.424998 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" event={"ID":"88f32422-f5bd-4fd8-85d1-ff2d6ccc1633","Type":"ContainerStarted","Data":"9b10c50a346260a66c683e9fc785cbf951d50aef243854269b7976d37c54343e"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.444114 4794 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" event={"ID":"84d790b6-ef4b-449b-80eb-0bc812ed496f","Type":"ContainerStarted","Data":"0b4decc0e21f152046542e51a5f5da5590fbabf700a5a5798fda7cfa1266b640"} Dec 15 14:11:41 crc kubenswrapper[4794]: E1215 14:11:41.453600 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:177bba84f71a0b2cfd00a31147aa349fe4c25c83d2b9df7563b5dd5cfeafc161\\\"\"" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" podUID="84d790b6-ef4b-449b-80eb-0bc812ed496f" Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.456739 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" event={"ID":"070970b1-bb19-4aa4-b544-241064874029","Type":"ContainerStarted","Data":"d6fdf02d7992e3dfa8072edd30c45313f2ff9c18e24dd90256062cea8b04c107"} Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.457869 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" event={"ID":"c52870d2-d447-44d4-b68c-420d695b65a0","Type":"ContainerStarted","Data":"c1fb9636f0c842d3f891d7df412d2158cf05268a8b890aab3545d6447821a204"} Dec 15 14:11:41 crc kubenswrapper[4794]: E1215 14:11:41.458646 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:03833d5c6982b42c836787ee1863f5f73e20dce26a154171de6c0cf4712938b5\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" podUID="070970b1-bb19-4aa4-b544-241064874029" Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.466477 4794 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" event={"ID":"91f28ab0-ea37-4fef-87f2-4150127c276e","Type":"ContainerStarted","Data":"bb5f6dfc3482a391be826754729a9b0edc65437766ada910d225cf91fc23d0b8"} Dec 15 14:11:41 crc kubenswrapper[4794]: E1215 14:11:41.475569 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e2b7b9bdbf93b2ff7012cd2af921ae43082fd3eb036d884f13292c2e56f505c\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" podUID="c52870d2-d447-44d4-b68c-420d695b65a0" Dec 15 14:11:41 crc kubenswrapper[4794]: I1215 14:11:41.498086 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" event={"ID":"d94816b6-2a4c-44fa-a7c0-811c18ec190d","Type":"ContainerStarted","Data":"16b643149cee225e37c694e188a701da7b23327d328c2c2c357febae70a66b01"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.509117 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" event={"ID":"d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd","Type":"ContainerStarted","Data":"dfff1598cc8d75b46f9c7d9b0b20c7da7a411530b119771f2a6a08b709179b66"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.509532 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.511889 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" event={"ID":"88f32422-f5bd-4fd8-85d1-ff2d6ccc1633","Type":"ContainerStarted","Data":"ce55490aa593454cb9f837b60ba4e1f86e1caf22d55ee75865088d8de67bbfd6"} Dec 15 14:11:42 crc 
kubenswrapper[4794]: I1215 14:11:42.512279 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.516035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc" event={"ID":"20583712-205a-4875-aef3-0052b1dc4382","Type":"ContainerStarted","Data":"a908f666a2bea50fda52d41a67c3009c8c6021650e60435df12d496756a29f35"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.516162 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.517641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" event={"ID":"91f28ab0-ea37-4fef-87f2-4150127c276e","Type":"ContainerStarted","Data":"fe50787dfc138512ac34d86b0f06224a1b49ed94c927d27599aabb5ed2287f0e"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.517749 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.519097 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8" event={"ID":"16115cae-adf6-4065-90c9-082ab050dc96","Type":"ContainerStarted","Data":"93af916e654bcff1e0083dee1d1347dd3fd9e0d064fecee112589db4e9ca3d9e"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.519215 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.521122 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx" event={"ID":"b6e33671-c04d-4fc0-825d-13355e317733","Type":"ContainerStarted","Data":"2b2eb0b2b50c12dbbeb80b88ff43e013a79a2a9efe20c191fa2cf768262a9a89"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.521656 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.533438 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" podStartSLOduration=4.166813942 podStartE2EDuration="18.533422466s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.152678231 +0000 UTC m=+1048.004700669" lastFinishedPulling="2025-12-15 14:11:40.519286745 +0000 UTC m=+1062.371309193" observedRunningTime="2025-12-15 14:11:42.530981097 +0000 UTC m=+1064.383003545" watchObservedRunningTime="2025-12-15 14:11:42.533422466 +0000 UTC m=+1064.385444914" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.543845 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" event={"ID":"e573b9b7-1b6c-40d6-93e0-c9103105034d","Type":"ContainerStarted","Data":"182aa6fcd12d12e98a1d94b44734f6a81fdc5b9aa93193ffdf62c86ab2fa023d"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.543894 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" event={"ID":"e573b9b7-1b6c-40d6-93e0-c9103105034d","Type":"ContainerStarted","Data":"e85f4486a9c209af8f7abb763619818a12ef21147e5911c644c9d7d0723c2580"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.544375 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.557403 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8" podStartSLOduration=6.998938546 podStartE2EDuration="18.557389023s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.182452701 +0000 UTC m=+1048.034475139" lastFinishedPulling="2025-12-15 14:11:37.740903168 +0000 UTC m=+1059.592925616" observedRunningTime="2025-12-15 14:11:42.551476826 +0000 UTC m=+1064.403499264" watchObservedRunningTime="2025-12-15 14:11:42.557389023 +0000 UTC m=+1064.409411461" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.566590 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" event={"ID":"9d2c6c2d-8ff4-416a-9cc6-447d855fd954","Type":"ContainerStarted","Data":"fe43de198dd88737e0c0fca3c32b4a550d1e7cedd7e32342b4a4c5cb097c3036"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.566734 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.577282 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" podStartSLOduration=5.547983354 podStartE2EDuration="18.577262114s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.553710503 +0000 UTC m=+1047.405732941" lastFinishedPulling="2025-12-15 14:11:38.582989223 +0000 UTC m=+1060.435011701" observedRunningTime="2025-12-15 14:11:42.570626797 +0000 UTC m=+1064.422649245" watchObservedRunningTime="2025-12-15 14:11:42.577262114 +0000 UTC m=+1064.429284552" Dec 15 14:11:42 crc kubenswrapper[4794]: 
I1215 14:11:42.579858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" event={"ID":"1a552228-3637-48fa-b860-64f1d63d9726","Type":"ContainerStarted","Data":"89c87d8d18a2796abb56d5a7f168381f39b55b88a44506250358959a146f7204"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.580508 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.582674 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" event={"ID":"02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262","Type":"ContainerStarted","Data":"3a5c16ea48553ac11fdeedfe8179b7d30c73299f103d3241460715d72c87929d"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.583060 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.584557 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" event={"ID":"488f9339-bc6f-419a-acdc-c4601f5f0d04","Type":"ContainerStarted","Data":"88d734f73a09e24ee33b8ab93c63b97bcf357c950a096b73d6b3de6410fee7b6"} Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.584981 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.587737 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" event={"ID":"d94816b6-2a4c-44fa-a7c0-811c18ec190d","Type":"ContainerStarted","Data":"400552b3d17969f7465c9ec7eb0ec8e8fc2ce4462bc7456648bff2b7e3f63d58"} Dec 15 14:11:42 
crc kubenswrapper[4794]: I1215 14:11:42.587788 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" Dec 15 14:11:42 crc kubenswrapper[4794]: E1215 14:11:42.588373 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:177bba84f71a0b2cfd00a31147aa349fe4c25c83d2b9df7563b5dd5cfeafc161\\\"\"" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" podUID="84d790b6-ef4b-449b-80eb-0bc812ed496f" Dec 15 14:11:42 crc kubenswrapper[4794]: E1215 14:11:42.588762 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e2b7b9bdbf93b2ff7012cd2af921ae43082fd3eb036d884f13292c2e56f505c\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" podUID="c52870d2-d447-44d4-b68c-420d695b65a0" Dec 15 14:11:42 crc kubenswrapper[4794]: E1215 14:11:42.589252 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:03833d5c6982b42c836787ee1863f5f73e20dce26a154171de6c0cf4712938b5\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" podUID="070970b1-bb19-4aa4-b544-241064874029" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.593649 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc" podStartSLOduration=4.179330446 podStartE2EDuration="18.593636137s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.184832809 
+0000 UTC m=+1048.036855237" lastFinishedPulling="2025-12-15 14:11:40.59913848 +0000 UTC m=+1062.451160928" observedRunningTime="2025-12-15 14:11:42.591680642 +0000 UTC m=+1064.443703080" watchObservedRunningTime="2025-12-15 14:11:42.593636137 +0000 UTC m=+1064.445658595" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.606988 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx" podStartSLOduration=4.119531486 podStartE2EDuration="18.606954003s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.111925719 +0000 UTC m=+1047.963948157" lastFinishedPulling="2025-12-15 14:11:40.599348196 +0000 UTC m=+1062.451370674" observedRunningTime="2025-12-15 14:11:42.60542309 +0000 UTC m=+1064.457445538" watchObservedRunningTime="2025-12-15 14:11:42.606954003 +0000 UTC m=+1064.458976441" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.653030 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" podStartSLOduration=4.535412733 podStartE2EDuration="18.653010454s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.417751478 +0000 UTC m=+1048.269773916" lastFinishedPulling="2025-12-15 14:11:40.535349199 +0000 UTC m=+1062.387371637" observedRunningTime="2025-12-15 14:11:42.628555783 +0000 UTC m=+1064.480578221" watchObservedRunningTime="2025-12-15 14:11:42.653010454 +0000 UTC m=+1064.505032882" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.653612 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" podStartSLOduration=3.401779744 podStartE2EDuration="18.653604301s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.346396347 +0000 UTC 
m=+1047.198418785" lastFinishedPulling="2025-12-15 14:11:40.598220894 +0000 UTC m=+1062.450243342" observedRunningTime="2025-12-15 14:11:42.649379721 +0000 UTC m=+1064.501402179" watchObservedRunningTime="2025-12-15 14:11:42.653604301 +0000 UTC m=+1064.505626739" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.692653 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" podStartSLOduration=4.079556148 podStartE2EDuration="18.692635593s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.976780443 +0000 UTC m=+1047.828802881" lastFinishedPulling="2025-12-15 14:11:40.589859858 +0000 UTC m=+1062.441882326" observedRunningTime="2025-12-15 14:11:42.686942812 +0000 UTC m=+1064.538965250" watchObservedRunningTime="2025-12-15 14:11:42.692635593 +0000 UTC m=+1064.544658031" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.721597 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" podStartSLOduration=3.635018652 podStartE2EDuration="18.721568161s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.513879528 +0000 UTC m=+1047.365901966" lastFinishedPulling="2025-12-15 14:11:40.600428997 +0000 UTC m=+1062.452451475" observedRunningTime="2025-12-15 14:11:42.719368428 +0000 UTC m=+1064.571390866" watchObservedRunningTime="2025-12-15 14:11:42.721568161 +0000 UTC m=+1064.573590599" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.766405 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" podStartSLOduration=4.278098075 podStartE2EDuration="18.766332825s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.055175186 +0000 UTC 
m=+1047.907197624" lastFinishedPulling="2025-12-15 14:11:40.543409936 +0000 UTC m=+1062.395432374" observedRunningTime="2025-12-15 14:11:42.7530765 +0000 UTC m=+1064.605098948" watchObservedRunningTime="2025-12-15 14:11:42.766332825 +0000 UTC m=+1064.618355283" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.814116 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" podStartSLOduration=4.402229171 podStartE2EDuration="18.814096654s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.123531937 +0000 UTC m=+1047.975554375" lastFinishedPulling="2025-12-15 14:11:40.53539938 +0000 UTC m=+1062.387421858" observedRunningTime="2025-12-15 14:11:42.791877056 +0000 UTC m=+1064.643899494" watchObservedRunningTime="2025-12-15 14:11:42.814096654 +0000 UTC m=+1064.666119092" Dec 15 14:11:42 crc kubenswrapper[4794]: I1215 14:11:42.816949 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" podStartSLOduration=3.5730813120000002 podStartE2EDuration="18.816942434s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.353885749 +0000 UTC m=+1047.205908187" lastFinishedPulling="2025-12-15 14:11:40.597746841 +0000 UTC m=+1062.449769309" observedRunningTime="2025-12-15 14:11:42.803912986 +0000 UTC m=+1064.655935424" watchObservedRunningTime="2025-12-15 14:11:42.816942434 +0000 UTC m=+1064.668964872" Dec 15 14:11:45 crc kubenswrapper[4794]: I1215 14:11:45.135309 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-6bmz8" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.637885 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" 
event={"ID":"9f9e1543-36d5-427f-b1cc-3eb9baa9d826","Type":"ContainerStarted","Data":"f4f620467950178f1f95ab824ad88f2358f270529ec801dfe6083ad9a70085e8"} Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.638850 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.640267 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" event={"ID":"c7e6e262-54be-48b7-8e26-098358cab436","Type":"ContainerStarted","Data":"9758426a065d96ba7adc3d4bf01448ded2a1db0324d71e5db75f15c3ba36be1d"} Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.644506 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.663937 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" event={"ID":"d36dacd3-9670-4927-a93f-cbc50b901ef5","Type":"ContainerStarted","Data":"d267b25f8fe2498094e56ff1f1239ddeb6e82fbf802c2cc49ef1c5d902b7a6e6"} Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.664644 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.669519 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" podStartSLOduration=3.06004324 podStartE2EDuration="22.669508223s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.185147328 +0000 UTC m=+1048.037169766" lastFinishedPulling="2025-12-15 14:11:45.794612311 +0000 UTC m=+1067.646634749" observedRunningTime="2025-12-15 
14:11:46.668138454 +0000 UTC m=+1068.520160902" watchObservedRunningTime="2025-12-15 14:11:46.669508223 +0000 UTC m=+1068.521530661" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.671019 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" event={"ID":"9bd54690-941b-45f7-b42d-a9b5f8ebc065","Type":"ContainerStarted","Data":"d1848d89149936abb6c3811fa85171caca3b0c77c2727c92dedf285a6a8ef854"} Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.671796 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.676440 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" event={"ID":"0dca5b13-a635-4513-988f-48091076cff9","Type":"ContainerStarted","Data":"7edfe2f4427772caeff73ec4709490eeb460964b7b771a41dabae0fb953622d5"} Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.676821 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.678146 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" event={"ID":"5e547ae0-d6e7-4dd7-b6c1-731554f36f8d","Type":"ContainerStarted","Data":"ae636f7c12bc8d93547f8ffed65d2554321ebaecbc9ece25065da322a0f4754b"} Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.678741 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.683047 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" podStartSLOduration=3.072370298 podStartE2EDuration="22.683036585s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.185046385 +0000 UTC m=+1048.037068823" lastFinishedPulling="2025-12-15 14:11:45.795712672 +0000 UTC m=+1067.647735110" observedRunningTime="2025-12-15 14:11:46.681257865 +0000 UTC m=+1068.533280303" watchObservedRunningTime="2025-12-15 14:11:46.683036585 +0000 UTC m=+1068.535059023" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.700201 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" podStartSLOduration=3.346466831 podStartE2EDuration="22.699909202s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.442320842 +0000 UTC m=+1048.294343290" lastFinishedPulling="2025-12-15 14:11:45.795763213 +0000 UTC m=+1067.647785661" observedRunningTime="2025-12-15 14:11:46.699329255 +0000 UTC m=+1068.551351723" watchObservedRunningTime="2025-12-15 14:11:46.699909202 +0000 UTC m=+1068.551931640" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.720184 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" podStartSLOduration=3.378561876 podStartE2EDuration="22.720168974s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.429707025 +0000 UTC m=+1048.281729463" lastFinishedPulling="2025-12-15 14:11:45.771314123 +0000 UTC m=+1067.623336561" observedRunningTime="2025-12-15 14:11:46.714831553 +0000 UTC m=+1068.566854011" watchObservedRunningTime="2025-12-15 14:11:46.720168974 +0000 UTC m=+1068.572191412" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.732290 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" podStartSLOduration=3.236404111 podStartE2EDuration="22.732276066s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.277323121 +0000 UTC m=+1048.129345559" lastFinishedPulling="2025-12-15 14:11:45.773195076 +0000 UTC m=+1067.625217514" observedRunningTime="2025-12-15 14:11:46.729896229 +0000 UTC m=+1068.581918657" watchObservedRunningTime="2025-12-15 14:11:46.732276066 +0000 UTC m=+1068.584298494" Dec 15 14:11:46 crc kubenswrapper[4794]: I1215 14:11:46.758121 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" podStartSLOduration=3.441693509 podStartE2EDuration="22.758105025s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.485038098 +0000 UTC m=+1048.337060546" lastFinishedPulling="2025-12-15 14:11:45.801449614 +0000 UTC m=+1067.653472062" observedRunningTime="2025-12-15 14:11:46.753858656 +0000 UTC m=+1068.605881094" watchObservedRunningTime="2025-12-15 14:11:46.758105025 +0000 UTC m=+1068.610127463" Dec 15 14:11:49 crc kubenswrapper[4794]: I1215 14:11:49.706952 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx" event={"ID":"5c962533-9c6b-459d-92b7-768ca6a8b110","Type":"ContainerStarted","Data":"94e9912b87b0f56b67b0bb208525b467d0c7541a671694cad505673549911fce"} Dec 15 14:11:49 crc kubenswrapper[4794]: I1215 14:11:49.736020 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx" podStartSLOduration=1.99101751 podStartE2EDuration="24.736000218s" podCreationTimestamp="2025-12-15 14:11:25 +0000 UTC" firstStartedPulling="2025-12-15 14:11:26.250204195 +0000 UTC m=+1048.102226633" lastFinishedPulling="2025-12-15 
14:11:48.995186893 +0000 UTC m=+1070.847209341" observedRunningTime="2025-12-15 14:11:49.729310289 +0000 UTC m=+1071.581332747" watchObservedRunningTime="2025-12-15 14:11:49.736000218 +0000 UTC m=+1071.588022666" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.686853 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-69977bdf55-4tl5z" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.703144 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5847f67c56-vg9n8" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.703687 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7b45cd6d68-dsxzg" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.828324 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6985cf78fb-zpwbb" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.928332 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f764db9b-kxkjz" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.932978 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7cc599445b-p7sck" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.944846 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-nq78b" Dec 15 14:11:54 crc kubenswrapper[4794]: I1215 14:11:54.975174 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-669b58f65-c97jv" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.095482 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-58879495c-l2dll" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.123720 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-269nm" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.208502 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-cc776f956-tmnhc" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.267664 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7c9ff8845d-tgfnx" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.297848 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6bc5b9c47-drrxw" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.372068 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-85d55b5858-r7v8b" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.435178 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5d79c6465c-mp2hm" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.443129 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" Dec 15 14:11:55 crc kubenswrapper[4794]: I1215 14:11:55.767770 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh" Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.943501 4794 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" event={"ID":"84d790b6-ef4b-449b-80eb-0bc812ed496f","Type":"ContainerStarted","Data":"5614e1de3a0cebdef30c588d112b9152ee3a452ad5541acad174d5d7e6f9ffe0"} Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.945289 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.947046 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" event={"ID":"070970b1-bb19-4aa4-b544-241064874029","Type":"ContainerStarted","Data":"0e0bb8a9c1d4eec9c2d416854ac3a9f05a2b0385cf7c98a3b41aca928655387b"} Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.947200 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.949475 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" event={"ID":"c52870d2-d447-44d4-b68c-420d695b65a0","Type":"ContainerStarted","Data":"9e4caa07cb9bde87757b956609ef6bca701553633cb743105ecae01df666c686"} Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.949621 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.964744 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" podStartSLOduration=2.259021855 podStartE2EDuration="43.964719341s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.976752232 +0000 UTC m=+1047.828774670" 
lastFinishedPulling="2025-12-15 14:12:07.682449728 +0000 UTC m=+1089.534472156" observedRunningTime="2025-12-15 14:12:07.963447735 +0000 UTC m=+1089.815470183" watchObservedRunningTime="2025-12-15 14:12:07.964719341 +0000 UTC m=+1089.816741799" Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.982808 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" podStartSLOduration=2.008864859 podStartE2EDuration="43.982787301s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.706362645 +0000 UTC m=+1047.558385073" lastFinishedPulling="2025-12-15 14:12:07.680285077 +0000 UTC m=+1089.532307515" observedRunningTime="2025-12-15 14:12:07.98025053 +0000 UTC m=+1089.832272968" watchObservedRunningTime="2025-12-15 14:12:07.982787301 +0000 UTC m=+1089.834809739" Dec 15 14:12:07 crc kubenswrapper[4794]: I1215 14:12:07.999505 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" podStartSLOduration=2.025666024 podStartE2EDuration="43.999486613s" podCreationTimestamp="2025-12-15 14:11:24 +0000 UTC" firstStartedPulling="2025-12-15 14:11:25.706755646 +0000 UTC m=+1047.558778084" lastFinishedPulling="2025-12-15 14:12:07.680576195 +0000 UTC m=+1089.532598673" observedRunningTime="2025-12-15 14:12:07.998741902 +0000 UTC m=+1089.850764360" watchObservedRunningTime="2025-12-15 14:12:07.999486613 +0000 UTC m=+1089.851509051" Dec 15 14:12:14 crc kubenswrapper[4794]: I1215 14:12:14.848524 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-54fd9dc4b5-bqsrj" Dec 15 14:12:14 crc kubenswrapper[4794]: I1215 14:12:14.946164 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-bb565c8dd-qt4th" Dec 
15 14:12:15 crc kubenswrapper[4794]: I1215 14:12:15.106450 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6b444986fd-fjddz" Dec 15 14:12:19 crc kubenswrapper[4794]: I1215 14:12:19.941398 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"] Dec 15 14:12:19 crc kubenswrapper[4794]: I1215 14:12:19.942100 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="kube-rbac-proxy" containerID="cri-o://b9eb37e2979547d7db8c0ff0e323f330da1893b513910b75ce6c4e42f532605b" gracePeriod=10 Dec 15 14:12:19 crc kubenswrapper[4794]: I1215 14:12:19.942190 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="manager" containerID="cri-o://d1848d89149936abb6c3811fa85171caca3b0c77c2727c92dedf285a6a8ef854" gracePeriod=10 Dec 15 14:12:19 crc kubenswrapper[4794]: I1215 14:12:19.987817 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5"] Dec 15 14:12:19 crc kubenswrapper[4794]: I1215 14:12:19.989349 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="operator" containerID="cri-o://81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079" gracePeriod=10 Dec 15 14:12:19 crc kubenswrapper[4794]: I1215 14:12:19.989405 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="kube-rbac-proxy" containerID="cri-o://de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c" gracePeriod=10 Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.082301 4794 generic.go:334] "Generic (PLEG): container finished" podID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerID="d1848d89149936abb6c3811fa85171caca3b0c77c2727c92dedf285a6a8ef854" exitCode=0 Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.082557 4794 generic.go:334] "Generic (PLEG): container finished" podID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerID="b9eb37e2979547d7db8c0ff0e323f330da1893b513910b75ce6c4e42f532605b" exitCode=0 Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.082460 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" event={"ID":"9bd54690-941b-45f7-b42d-a9b5f8ebc065","Type":"ContainerDied","Data":"d1848d89149936abb6c3811fa85171caca3b0c77c2727c92dedf285a6a8ef854"} Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.082721 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" event={"ID":"9bd54690-941b-45f7-b42d-a9b5f8ebc065","Type":"ContainerDied","Data":"b9eb37e2979547d7db8c0ff0e323f330da1893b513910b75ce6c4e42f532605b"} Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.485667 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.492283 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.560661 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8xmk\" (UniqueName: \"kubernetes.io/projected/46324c1c-ee38-4acf-b149-6a6a970e0df4-kube-api-access-b8xmk\") pod \"46324c1c-ee38-4acf-b149-6a6a970e0df4\" (UID: \"46324c1c-ee38-4acf-b149-6a6a970e0df4\") " Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.565504 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46324c1c-ee38-4acf-b149-6a6a970e0df4-kube-api-access-b8xmk" (OuterVolumeSpecName: "kube-api-access-b8xmk") pod "46324c1c-ee38-4acf-b149-6a6a970e0df4" (UID: "46324c1c-ee38-4acf-b149-6a6a970e0df4"). InnerVolumeSpecName "kube-api-access-b8xmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.662495 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvf56\" (UniqueName: \"kubernetes.io/projected/9bd54690-941b-45f7-b42d-a9b5f8ebc065-kube-api-access-lvf56\") pod \"9bd54690-941b-45f7-b42d-a9b5f8ebc065\" (UID: \"9bd54690-941b-45f7-b42d-a9b5f8ebc065\") " Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.662839 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8xmk\" (UniqueName: \"kubernetes.io/projected/46324c1c-ee38-4acf-b149-6a6a970e0df4-kube-api-access-b8xmk\") on node \"crc\" DevicePath \"\"" Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.668733 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd54690-941b-45f7-b42d-a9b5f8ebc065-kube-api-access-lvf56" (OuterVolumeSpecName: "kube-api-access-lvf56") pod "9bd54690-941b-45f7-b42d-a9b5f8ebc065" (UID: "9bd54690-941b-45f7-b42d-a9b5f8ebc065"). InnerVolumeSpecName "kube-api-access-lvf56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:12:20 crc kubenswrapper[4794]: I1215 14:12:20.764385 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvf56\" (UniqueName: \"kubernetes.io/projected/9bd54690-941b-45f7-b42d-a9b5f8ebc065-kube-api-access-lvf56\") on node \"crc\" DevicePath \"\"" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093389 4794 generic.go:334] "Generic (PLEG): container finished" podID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerID="de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c" exitCode=0 Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093416 4794 generic.go:334] "Generic (PLEG): container finished" podID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerID="81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079" exitCode=0 Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093437 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093459 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" event={"ID":"46324c1c-ee38-4acf-b149-6a6a970e0df4","Type":"ContainerDied","Data":"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c"} Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093489 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" event={"ID":"46324c1c-ee38-4acf-b149-6a6a970e0df4","Type":"ContainerDied","Data":"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079"} Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5" 
event={"ID":"46324c1c-ee38-4acf-b149-6a6a970e0df4","Type":"ContainerDied","Data":"229b451dfdd8da2122e1ec0410df7543b294ef65eec34d5a530fc3c42a9af507"} Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.093514 4794 scope.go:117] "RemoveContainer" containerID="de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.102892 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" event={"ID":"9bd54690-941b-45f7-b42d-a9b5f8ebc065","Type":"ContainerDied","Data":"838cb71dfea876f11b30bb2834d45ba66a2fe7081d67d72513e715641c38a8a4"} Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.103035 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.128231 4794 scope.go:117] "RemoveContainer" containerID="81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.130156 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5"] Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.152039 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59d975664b-tfxz5"] Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.157657 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"] Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.159860 4794 scope.go:117] "RemoveContainer" containerID="de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c" Dec 15 14:12:21 crc kubenswrapper[4794]: E1215 14:12:21.161898 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c\": container with ID starting with de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c not found: ID does not exist" containerID="de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.161948 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c"} err="failed to get container status \"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c\": rpc error: code = NotFound desc = could not find container \"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c\": container with ID starting with de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c not found: ID does not exist" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.161977 4794 scope.go:117] "RemoveContainer" containerID="81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.163011 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7687976674-p8vpp"] Dec 15 14:12:21 crc kubenswrapper[4794]: E1215 14:12:21.163148 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079\": container with ID starting with 81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079 not found: ID does not exist" containerID="81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.163170 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079"} 
err="failed to get container status \"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079\": rpc error: code = NotFound desc = could not find container \"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079\": container with ID starting with 81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079 not found: ID does not exist" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.163185 4794 scope.go:117] "RemoveContainer" containerID="de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.163512 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c"} err="failed to get container status \"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c\": rpc error: code = NotFound desc = could not find container \"de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c\": container with ID starting with de4f0e13b41944166f67747326dd44a97372c66e02ab70b48195d27581a0d08c not found: ID does not exist" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.163572 4794 scope.go:117] "RemoveContainer" containerID="81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.164135 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079"} err="failed to get container status \"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079\": rpc error: code = NotFound desc = could not find container \"81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079\": container with ID starting with 81f3ba6864ad8ba92d46e62d37d74a673951ed8575840a9d5a8bdcc966578079 not found: ID does not exist" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.164178 4794 scope.go:117] 
"RemoveContainer" containerID="d1848d89149936abb6c3811fa85171caca3b0c77c2727c92dedf285a6a8ef854" Dec 15 14:12:21 crc kubenswrapper[4794]: I1215 14:12:21.187397 4794 scope.go:117] "RemoveContainer" containerID="b9eb37e2979547d7db8c0ff0e323f330da1893b513910b75ce6c4e42f532605b" Dec 15 14:12:22 crc kubenswrapper[4794]: I1215 14:12:22.753147 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" path="/var/lib/kubelet/pods/46324c1c-ee38-4acf-b149-6a6a970e0df4/volumes" Dec 15 14:12:22 crc kubenswrapper[4794]: I1215 14:12:22.754720 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" path="/var/lib/kubelet/pods/9bd54690-941b-45f7-b42d-a9b5f8ebc065/volumes" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289287 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-sr5v2"] Dec 15 14:12:23 crc kubenswrapper[4794]: E1215 14:12:23.289575 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="manager" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289602 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="manager" Dec 15 14:12:23 crc kubenswrapper[4794]: E1215 14:12:23.289629 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="kube-rbac-proxy" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289635 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="kube-rbac-proxy" Dec 15 14:12:23 crc kubenswrapper[4794]: E1215 14:12:23.289646 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="kube-rbac-proxy" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289652 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="kube-rbac-proxy" Dec 15 14:12:23 crc kubenswrapper[4794]: E1215 14:12:23.289664 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="operator" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289670 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="operator" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289810 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="kube-rbac-proxy" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289823 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="46324c1c-ee38-4acf-b149-6a6a970e0df4" containerName="operator" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289837 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="kube-rbac-proxy" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.289846 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd54690-941b-45f7-b42d-a9b5f8ebc065" containerName="manager" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.290305 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-sr5v2" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.301036 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-index-dockercfg-l49cf" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.306906 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-sr5v2"] Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.400456 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dggwq\" (UniqueName: \"kubernetes.io/projected/5bd930c9-ead7-4313-9f2f-ef3df0d06af2-kube-api-access-dggwq\") pod \"watcher-operator-index-sr5v2\" (UID: \"5bd930c9-ead7-4313-9f2f-ef3df0d06af2\") " pod="openstack-operators/watcher-operator-index-sr5v2" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.502192 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dggwq\" (UniqueName: \"kubernetes.io/projected/5bd930c9-ead7-4313-9f2f-ef3df0d06af2-kube-api-access-dggwq\") pod \"watcher-operator-index-sr5v2\" (UID: \"5bd930c9-ead7-4313-9f2f-ef3df0d06af2\") " pod="openstack-operators/watcher-operator-index-sr5v2" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.521345 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dggwq\" (UniqueName: \"kubernetes.io/projected/5bd930c9-ead7-4313-9f2f-ef3df0d06af2-kube-api-access-dggwq\") pod \"watcher-operator-index-sr5v2\" (UID: \"5bd930c9-ead7-4313-9f2f-ef3df0d06af2\") " pod="openstack-operators/watcher-operator-index-sr5v2" Dec 15 14:12:23 crc kubenswrapper[4794]: I1215 14:12:23.611006 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-sr5v2"
Dec 15 14:12:24 crc kubenswrapper[4794]: I1215 14:12:24.095626 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-sr5v2"]
Dec 15 14:12:24 crc kubenswrapper[4794]: W1215 14:12:24.099426 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bd930c9_ead7_4313_9f2f_ef3df0d06af2.slice/crio-b709f06dde9a0a429ac3d883aa8dedd50eab3d22e7d74b59156edccff18e1b14 WatchSource:0}: Error finding container b709f06dde9a0a429ac3d883aa8dedd50eab3d22e7d74b59156edccff18e1b14: Status 404 returned error can't find the container with id b709f06dde9a0a429ac3d883aa8dedd50eab3d22e7d74b59156edccff18e1b14
Dec 15 14:12:24 crc kubenswrapper[4794]: I1215 14:12:24.140672 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-sr5v2" event={"ID":"5bd930c9-ead7-4313-9f2f-ef3df0d06af2","Type":"ContainerStarted","Data":"b709f06dde9a0a429ac3d883aa8dedd50eab3d22e7d74b59156edccff18e1b14"}
Dec 15 14:12:24 crc kubenswrapper[4794]: I1215 14:12:24.533964 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 15 14:12:24 crc kubenswrapper[4794]: I1215 14:12:24.534332 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 15 14:12:25 crc kubenswrapper[4794]: I1215 14:12:25.151283 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-sr5v2" event={"ID":"5bd930c9-ead7-4313-9f2f-ef3df0d06af2","Type":"ContainerStarted","Data":"c4867ab337c25f666d07b7328ce7d90af14eacf1d33663aff0de51c6b4b1c6d2"}
Dec 15 14:12:25 crc kubenswrapper[4794]: I1215 14:12:25.175293 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-sr5v2" podStartSLOduration=1.993894209 podStartE2EDuration="2.175274411s" podCreationTimestamp="2025-12-15 14:12:23 +0000 UTC" firstStartedPulling="2025-12-15 14:12:24.101741709 +0000 UTC m=+1105.953764137" lastFinishedPulling="2025-12-15 14:12:24.283121901 +0000 UTC m=+1106.135144339" observedRunningTime="2025-12-15 14:12:25.173858703 +0000 UTC m=+1107.025881151" watchObservedRunningTime="2025-12-15 14:12:25.175274411 +0000 UTC m=+1107.027296859"
Dec 15 14:12:33 crc kubenswrapper[4794]: I1215 14:12:33.611691 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/watcher-operator-index-sr5v2"
Dec 15 14:12:33 crc kubenswrapper[4794]: I1215 14:12:33.612175 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-index-sr5v2"
Dec 15 14:12:33 crc kubenswrapper[4794]: I1215 14:12:33.652706 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/watcher-operator-index-sr5v2"
Dec 15 14:12:34 crc kubenswrapper[4794]: I1215 14:12:34.302832 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-index-sr5v2"
Dec 15 14:12:39 crc kubenswrapper[4794]: I1215 14:12:39.936785 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"]
Dec 15 14:12:39 crc kubenswrapper[4794]: I1215 14:12:39.944078 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:39 crc kubenswrapper[4794]: I1215 14:12:39.950770 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9nhp5"
Dec 15 14:12:39 crc kubenswrapper[4794]: I1215 14:12:39.962884 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"]
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.062164 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fnsh\" (UniqueName: \"kubernetes.io/projected/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-kube-api-access-6fnsh\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.062485 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-util\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.062683 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-bundle\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.164236 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fnsh\" (UniqueName: \"kubernetes.io/projected/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-kube-api-access-6fnsh\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.164348 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-util\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.164450 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-bundle\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.165241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-bundle\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.165803 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-util\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.192918 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fnsh\" (UniqueName: \"kubernetes.io/projected/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-kube-api-access-6fnsh\") pod \"71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") " pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.284210 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:40 crc kubenswrapper[4794]: I1215 14:12:40.806342 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"]
Dec 15 14:12:41 crc kubenswrapper[4794]: I1215 14:12:41.299192 4794 generic.go:334] "Generic (PLEG): container finished" podID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerID="a21615a71fba9d17c81c5c71e786480570540cfb5ecd0c6947de8dbde0ffc4a9" exitCode=0
Dec 15 14:12:41 crc kubenswrapper[4794]: I1215 14:12:41.299249 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w" event={"ID":"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512","Type":"ContainerDied","Data":"a21615a71fba9d17c81c5c71e786480570540cfb5ecd0c6947de8dbde0ffc4a9"}
Dec 15 14:12:41 crc kubenswrapper[4794]: I1215 14:12:41.299663 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w" event={"ID":"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512","Type":"ContainerStarted","Data":"8fe3e1a98b5bdc134e6763dd29fdd8dd57192f35106116b6db0c682ece102eda"}
Dec 15 14:12:42 crc kubenswrapper[4794]: I1215 14:12:42.309297 4794 generic.go:334] "Generic (PLEG): container finished" podID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerID="5d32dd81c4ebd43ca6ac9d13c84b259fffcd3a4d7991956d95f7a84f8996cf18" exitCode=0
Dec 15 14:12:42 crc kubenswrapper[4794]: I1215 14:12:42.309502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w" event={"ID":"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512","Type":"ContainerDied","Data":"5d32dd81c4ebd43ca6ac9d13c84b259fffcd3a4d7991956d95f7a84f8996cf18"}
Dec 15 14:12:43 crc kubenswrapper[4794]: I1215 14:12:43.323779 4794 generic.go:334] "Generic (PLEG): container finished" podID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerID="132c7d9fbd76b70d7d6293e26e111542b5f0624f0dd9d82bf4fc8ef280075f6c" exitCode=0
Dec 15 14:12:43 crc kubenswrapper[4794]: I1215 14:12:43.323839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w" event={"ID":"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512","Type":"ContainerDied","Data":"132c7d9fbd76b70d7d6293e26e111542b5f0624f0dd9d82bf4fc8ef280075f6c"}
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.694733 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.835319 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fnsh\" (UniqueName: \"kubernetes.io/projected/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-kube-api-access-6fnsh\") pod \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") "
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.835418 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-bundle\") pod \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") "
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.835631 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-util\") pod \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\" (UID: \"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512\") "
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.837715 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-bundle" (OuterVolumeSpecName: "bundle") pod "86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" (UID: "86b0d3e1-9b95-41f8-8fa9-27ae9efb3512"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.846907 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-kube-api-access-6fnsh" (OuterVolumeSpecName: "kube-api-access-6fnsh") pod "86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" (UID: "86b0d3e1-9b95-41f8-8fa9-27ae9efb3512"). InnerVolumeSpecName "kube-api-access-6fnsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.870376 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-util" (OuterVolumeSpecName: "util") pod "86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" (UID: "86b0d3e1-9b95-41f8-8fa9-27ae9efb3512"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.939516 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fnsh\" (UniqueName: \"kubernetes.io/projected/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-kube-api-access-6fnsh\") on node \"crc\" DevicePath \"\""
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.939567 4794 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-bundle\") on node \"crc\" DevicePath \"\""
Dec 15 14:12:44 crc kubenswrapper[4794]: I1215 14:12:44.939608 4794 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86b0d3e1-9b95-41f8-8fa9-27ae9efb3512-util\") on node \"crc\" DevicePath \"\""
Dec 15 14:12:45 crc kubenswrapper[4794]: I1215 14:12:45.347931 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w" event={"ID":"86b0d3e1-9b95-41f8-8fa9-27ae9efb3512","Type":"ContainerDied","Data":"8fe3e1a98b5bdc134e6763dd29fdd8dd57192f35106116b6db0c682ece102eda"}
Dec 15 14:12:45 crc kubenswrapper[4794]: I1215 14:12:45.347986 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fe3e1a98b5bdc134e6763dd29fdd8dd57192f35106116b6db0c682ece102eda"
Dec 15 14:12:45 crc kubenswrapper[4794]: I1215 14:12:45.348030 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.167562 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"]
Dec 15 14:12:50 crc kubenswrapper[4794]: E1215 14:12:50.168104 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="util"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.168116 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="util"
Dec 15 14:12:50 crc kubenswrapper[4794]: E1215 14:12:50.168131 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="extract"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.168137 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="extract"
Dec 15 14:12:50 crc kubenswrapper[4794]: E1215 14:12:50.168149 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="pull"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.168155 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="pull"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.168283 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b0d3e1-9b95-41f8-8fa9-27ae9efb3512" containerName="extract"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.168995 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.171088 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-25hzt"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.177296 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-service-cert"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.192412 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"]
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.318798 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9lh\" (UniqueName: \"kubernetes.io/projected/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-kube-api-access-rd9lh\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.318913 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-webhook-cert\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.319003 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-apiservice-cert\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.420481 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-webhook-cert\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.420567 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-apiservice-cert\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.420643 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9lh\" (UniqueName: \"kubernetes.io/projected/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-kube-api-access-rd9lh\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.433808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-webhook-cert\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.436307 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-apiservice-cert\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.452615 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9lh\" (UniqueName: \"kubernetes.io/projected/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-kube-api-access-rd9lh\") pod \"watcher-operator-controller-manager-8496c847c5-7vmxn\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") " pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.496956 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:50 crc kubenswrapper[4794]: I1215 14:12:50.965022 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"]
Dec 15 14:12:50 crc kubenswrapper[4794]: W1215 14:12:50.971558 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbfe68f_68c1_4c98_8ffa_9795cffede8f.slice/crio-206aff16fde753c175fe1962985dc8f566d7dc53a56e9f8bf977a55d71a8c440 WatchSource:0}: Error finding container 206aff16fde753c175fe1962985dc8f566d7dc53a56e9f8bf977a55d71a8c440: Status 404 returned error can't find the container with id 206aff16fde753c175fe1962985dc8f566d7dc53a56e9f8bf977a55d71a8c440
Dec 15 14:12:51 crc kubenswrapper[4794]: I1215 14:12:51.397369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" event={"ID":"2bbfe68f-68c1-4c98-8ffa-9795cffede8f","Type":"ContainerStarted","Data":"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"}
Dec 15 14:12:51 crc kubenswrapper[4794]: I1215 14:12:51.397819 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:12:51 crc kubenswrapper[4794]: I1215 14:12:51.397832 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" event={"ID":"2bbfe68f-68c1-4c98-8ffa-9795cffede8f","Type":"ContainerStarted","Data":"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af"}
Dec 15 14:12:51 crc kubenswrapper[4794]: I1215 14:12:51.397842 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" event={"ID":"2bbfe68f-68c1-4c98-8ffa-9795cffede8f","Type":"ContainerStarted","Data":"206aff16fde753c175fe1962985dc8f566d7dc53a56e9f8bf977a55d71a8c440"}
Dec 15 14:12:51 crc kubenswrapper[4794]: I1215 14:12:51.418968 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" podStartSLOduration=1.418950465 podStartE2EDuration="1.418950465s" podCreationTimestamp="2025-12-15 14:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:12:51.415144182 +0000 UTC m=+1133.267166630" watchObservedRunningTime="2025-12-15 14:12:51.418950465 +0000 UTC m=+1133.270972903"
Dec 15 14:12:54 crc kubenswrapper[4794]: I1215 14:12:54.534697 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 15 14:12:54 crc kubenswrapper[4794]: I1215 14:12:54.535354 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 15 14:13:00 crc kubenswrapper[4794]: I1215 14:13:00.504164 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.654033 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"]
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.655783 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.671023 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"]
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.804329 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwkn\" (UniqueName: \"kubernetes.io/projected/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-kube-api-access-htwkn\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.804403 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-webhook-cert\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.804480 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-apiservice-cert\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.906150 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwkn\" (UniqueName: \"kubernetes.io/projected/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-kube-api-access-htwkn\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.906987 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-webhook-cert\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.907021 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-apiservice-cert\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.912196 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-apiservice-cert\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.913712 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-webhook-cert\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.922885 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwkn\" (UniqueName: \"kubernetes.io/projected/92dbb135-aa8d-4392-b6b5-53bdfd6d1c40-kube-api-access-htwkn\") pod \"watcher-operator-controller-manager-6cd5749bb8-b2v8k\" (UID: \"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40\") " pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:01 crc kubenswrapper[4794]: I1215 14:13:01.977141 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:02 crc kubenswrapper[4794]: W1215 14:13:02.229302 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92dbb135_aa8d_4392_b6b5_53bdfd6d1c40.slice/crio-4d597cd7d610b1fb18fb0d5f7c4a98a936503748a3ac0d22ad55d4dcd428bd2a WatchSource:0}: Error finding container 4d597cd7d610b1fb18fb0d5f7c4a98a936503748a3ac0d22ad55d4dcd428bd2a: Status 404 returned error can't find the container with id 4d597cd7d610b1fb18fb0d5f7c4a98a936503748a3ac0d22ad55d4dcd428bd2a
Dec 15 14:13:02 crc kubenswrapper[4794]: I1215 14:13:02.231133 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"]
Dec 15 14:13:02 crc kubenswrapper[4794]: I1215 14:13:02.488653 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k" event={"ID":"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40","Type":"ContainerStarted","Data":"4d597cd7d610b1fb18fb0d5f7c4a98a936503748a3ac0d22ad55d4dcd428bd2a"}
Dec 15 14:13:03 crc kubenswrapper[4794]: I1215 14:13:03.499105 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k" event={"ID":"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40","Type":"ContainerStarted","Data":"fdc7be14b569e9df0f605ff37e762dbdf5f7ce093b93689b9b9a427dccbdf9f9"}
Dec 15 14:13:03 crc kubenswrapper[4794]: I1215 14:13:03.499480 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:03 crc kubenswrapper[4794]: I1215 14:13:03.499500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k" event={"ID":"92dbb135-aa8d-4392-b6b5-53bdfd6d1c40","Type":"ContainerStarted","Data":"231f94b56b7fb4963ff27ecd711795b802ac8770e57a5a5cebcbf57c4e5b29b4"}
Dec 15 14:13:03 crc kubenswrapper[4794]: I1215 14:13:03.531390 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k" podStartSLOduration=2.531366651 podStartE2EDuration="2.531366651s" podCreationTimestamp="2025-12-15 14:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:13:03.526458558 +0000 UTC m=+1145.378481036" watchObservedRunningTime="2025-12-15 14:13:03.531366651 +0000 UTC m=+1145.383389149"
Dec 15 14:13:11 crc kubenswrapper[4794]: I1215 14:13:11.992525 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cd5749bb8-b2v8k"
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.091328 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"]
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.091591 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="manager" containerID="cri-o://a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af" gracePeriod=10
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.091740 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="kube-rbac-proxy" containerID="cri-o://887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd" gracePeriod=10
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.506916 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.568828 4794 generic.go:334] "Generic (PLEG): container finished" podID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerID="887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd" exitCode=0
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.568864 4794 generic.go:334] "Generic (PLEG): container finished" podID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerID="a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af" exitCode=0
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.568887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" event={"ID":"2bbfe68f-68c1-4c98-8ffa-9795cffede8f","Type":"ContainerDied","Data":"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"}
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.568916 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" event={"ID":"2bbfe68f-68c1-4c98-8ffa-9795cffede8f","Type":"ContainerDied","Data":"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af"}
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.568930 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn" event={"ID":"2bbfe68f-68c1-4c98-8ffa-9795cffede8f","Type":"ContainerDied","Data":"206aff16fde753c175fe1962985dc8f566d7dc53a56e9f8bf977a55d71a8c440"}
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.568948 4794 scope.go:117] "RemoveContainer" containerID="887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.569085 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.579276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-webhook-cert\") pod \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") "
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.579465 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd9lh\" (UniqueName: \"kubernetes.io/projected/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-kube-api-access-rd9lh\") pod \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") "
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.579497 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-apiservice-cert\") pod \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\" (UID: \"2bbfe68f-68c1-4c98-8ffa-9795cffede8f\") "
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.584347 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-kube-api-access-rd9lh" (OuterVolumeSpecName: "kube-api-access-rd9lh") pod "2bbfe68f-68c1-4c98-8ffa-9795cffede8f" (UID: "2bbfe68f-68c1-4c98-8ffa-9795cffede8f"). InnerVolumeSpecName "kube-api-access-rd9lh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.584522 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "2bbfe68f-68c1-4c98-8ffa-9795cffede8f" (UID: "2bbfe68f-68c1-4c98-8ffa-9795cffede8f"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.584600 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "2bbfe68f-68c1-4c98-8ffa-9795cffede8f" (UID: "2bbfe68f-68c1-4c98-8ffa-9795cffede8f"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.585803 4794 scope.go:117] "RemoveContainer" containerID="a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af"
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.618838 4794 scope.go:117] "RemoveContainer" containerID="887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"
Dec 15 14:13:12 crc kubenswrapper[4794]: E1215 14:13:12.620355 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd\": container with ID starting with 887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd not found: ID does not exist" containerID="887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"
Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.620397 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"} err="failed to get
container status \"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd\": rpc error: code = NotFound desc = could not find container \"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd\": container with ID starting with 887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd not found: ID does not exist" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.620423 4794 scope.go:117] "RemoveContainer" containerID="a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af" Dec 15 14:13:12 crc kubenswrapper[4794]: E1215 14:13:12.627167 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af\": container with ID starting with a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af not found: ID does not exist" containerID="a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.627215 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af"} err="failed to get container status \"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af\": rpc error: code = NotFound desc = could not find container \"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af\": container with ID starting with a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af not found: ID does not exist" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.627241 4794 scope.go:117] "RemoveContainer" containerID="887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.631214 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd"} 
err="failed to get container status \"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd\": rpc error: code = NotFound desc = could not find container \"887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd\": container with ID starting with 887a9e042852e02d338f1575f24315a90c6daa6af6efc4a075159e714cc6c1bd not found: ID does not exist" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.631243 4794 scope.go:117] "RemoveContainer" containerID="a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.631561 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af"} err="failed to get container status \"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af\": rpc error: code = NotFound desc = could not find container \"a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af\": container with ID starting with a8d8cfeb6a3c76d369a2c2daac935048ce1f41b190cbfd7a8200fb76f0ab34af not found: ID does not exist" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.680724 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd9lh\" (UniqueName: \"kubernetes.io/projected/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-kube-api-access-rd9lh\") on node \"crc\" DevicePath \"\"" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.680763 4794 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.680773 4794 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bbfe68f-68c1-4c98-8ffa-9795cffede8f-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 
14:13:12.888875 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"] Dec 15 14:13:12 crc kubenswrapper[4794]: I1215 14:13:12.893954 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8496c847c5-7vmxn"] Dec 15 14:13:14 crc kubenswrapper[4794]: I1215 14:13:14.747669 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" path="/var/lib/kubelet/pods/2bbfe68f-68c1-4c98-8ffa-9795cffede8f/volumes" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.534571 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.535260 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.535321 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.536114 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69cd9719316ef2e80637a18253154a420b07d0d2cd576a13b5c600dba7da07e9"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 
14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.536196 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://69cd9719316ef2e80637a18253154a420b07d0d2cd576a13b5c600dba7da07e9" gracePeriod=600 Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.683695 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="69cd9719316ef2e80637a18253154a420b07d0d2cd576a13b5c600dba7da07e9" exitCode=0 Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.683764 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"69cd9719316ef2e80637a18253154a420b07d0d2cd576a13b5c600dba7da07e9"} Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.684088 4794 scope.go:117] "RemoveContainer" containerID="412df8a876437db1b8dca58aacf9cc551c4e0ee66a2d4593a2815bb14d689e05" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.718904 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 15 14:13:24 crc kubenswrapper[4794]: E1215 14:13:24.719296 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="kube-rbac-proxy" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.719319 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="kube-rbac-proxy" Dec 15 14:13:24 crc kubenswrapper[4794]: E1215 14:13:24.719339 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="manager" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.719348 4794 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="manager" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.719525 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="kube-rbac-proxy" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.719544 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbfe68f-68c1-4c98-8ffa-9795cffede8f" containerName="manager" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.721331 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.723475 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-config-data" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.723696 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-server-conf" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.723849 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openshift-service-ca.crt" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.724265 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"kube-root-ca.crt" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.724453 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-default-user" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.724628 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-plugins-conf" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.724790 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-svc" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.724963 
4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-erlang-cookie" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.725271 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-server-dockercfg-msk69" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.774849 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848412 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9kz6\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-kube-api-access-c9kz6\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848496 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-server-conf\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848596 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848649 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848713 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/823e3288-4f23-430b-843e-50f2e4230b46-pod-info\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848737 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848771 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848801 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/823e3288-4f23-430b-843e-50f2e4230b46-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848847 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-config-data\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.848885 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.949543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/823e3288-4f23-430b-843e-50f2e4230b46-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.949958 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-config-data\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.949993 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " 
pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9kz6\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-kube-api-access-c9kz6\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950077 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950108 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-server-conf\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950147 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc 
kubenswrapper[4794]: I1215 14:13:24.950171 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/823e3288-4f23-430b-843e-50f2e4230b46-pod-info\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950218 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950783 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-config-data\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.950809 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.951065 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.951406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/823e3288-4f23-430b-843e-50f2e4230b46-server-conf\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.951634 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.953496 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.953524 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c1894f0ffa805974c6865e10ea76545d60776b6050aa7e6ab3ed421ce90d69cd/globalmount\"" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.954843 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/823e3288-4f23-430b-843e-50f2e4230b46-pod-info\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.954875 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.955110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/823e3288-4f23-430b-843e-50f2e4230b46-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.956212 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.971014 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9kz6\" (UniqueName: \"kubernetes.io/projected/823e3288-4f23-430b-843e-50f2e4230b46-kube-api-access-c9kz6\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:24 crc kubenswrapper[4794]: I1215 14:13:24.989707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e84cec77-8f82-4355-a83e-a2d7b3e48339\") pod \"rabbitmq-server-0\" (UID: \"823e3288-4f23-430b-843e-50f2e4230b46\") " pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.110493 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.554007 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Dec 15 14:13:25 crc kubenswrapper[4794]: W1215 14:13:25.560731 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod823e3288_4f23_430b_843e_50f2e4230b46.slice/crio-eecf5f4df2a2f526dbc091476c65ab56185f785afb98a5ebb436484a96da19f9 WatchSource:0}: Error finding container eecf5f4df2a2f526dbc091476c65ab56185f785afb98a5ebb436484a96da19f9: Status 404 returned error can't find the container with id eecf5f4df2a2f526dbc091476c65ab56185f785afb98a5ebb436484a96da19f9 Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.696011 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"823e3288-4f23-430b-843e-50f2e4230b46","Type":"ContainerStarted","Data":"eecf5f4df2a2f526dbc091476c65ab56185f785afb98a5ebb436484a96da19f9"} Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.699464 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"29f0e3d8aa7777138c2e7242f8fdf176d9b72c162c2a35610e7f08f0da28f0c2"} Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.747558 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.748813 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.754202 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-config-data" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.754695 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-default-user" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.754820 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-dockercfg-2zpn2" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.762193 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-erlang-cookie" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.762411 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-plugins-conf" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.768646 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-conf" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.769296 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-notifications-svc" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.788135 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.878395 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " 
pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.878443 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndn28\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-kube-api-access-ndn28\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.878465 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.887063 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2eef62de-2115-49c1-bb86-8526606f7a69-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.888039 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.888173 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.888219 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.889324 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.889502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.889605 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc 
kubenswrapper[4794]: I1215 14:13:25.889772 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2eef62de-2115-49c1-bb86-8526606f7a69-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992241 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992294 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992332 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2eef62de-2115-49c1-bb86-8526606f7a69-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992381 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" 
Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndn28\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-kube-api-access-ndn28\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992452 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992478 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2eef62de-2115-49c1-bb86-8526606f7a69-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992503 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992563 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.992637 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.993176 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.993211 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.993471 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.993923 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.994056 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2eef62de-2115-49c1-bb86-8526606f7a69-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:25 crc kubenswrapper[4794]: I1215 14:13:25.999548 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2eef62de-2115-49c1-bb86-8526606f7a69-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.001520 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.001896 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2eef62de-2115-49c1-bb86-8526606f7a69-erlang-cookie-secret\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.002863 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.002890 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7fd8f5200b06d6fd4ee8f87e0128b5416424a916f9319642f0b1395f3043bfc9/globalmount\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.010220 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.017114 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndn28\" (UniqueName: \"kubernetes.io/projected/2eef62de-2115-49c1-bb86-8526606f7a69-kube-api-access-ndn28\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.032990 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5b75a82-767b-4e81-a83d-c8ba01d007d4\") pod \"rabbitmq-notifications-server-0\" (UID: \"2eef62de-2115-49c1-bb86-8526606f7a69\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.083892 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.329055 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Dec 15 14:13:26 crc kubenswrapper[4794]: I1215 14:13:26.709278 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"2eef62de-2115-49c1-bb86-8526606f7a69","Type":"ContainerStarted","Data":"b2a42f999ce9009447eccacc1af1b462aa1246e4cb3030e09ab7a028655ecea5"} Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.229485 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.231020 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.238353 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.238513 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-scripts" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.238553 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config-data" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.238624 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"galera-openstack-dockercfg-v6v8q" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.238807 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-galera-openstack-svc" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.254496 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.273265 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"combined-ca-bundle" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.311272 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.311314 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsndp\" (UniqueName: \"kubernetes.io/projected/cd2eff53-fa29-451f-ab58-ea3e9639bbea-kube-api-access-hsndp\") 
pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.311367 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.311771 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.313201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-secrets\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.313244 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd2eff53-fa29-451f-ab58-ea3e9639bbea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.313344 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.313471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.313562 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.415521 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-secrets\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.415566 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd2eff53-fa29-451f-ab58-ea3e9639bbea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.415615 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.417098 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.417182 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.417220 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.417250 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsndp\" (UniqueName: \"kubernetes.io/projected/cd2eff53-fa29-451f-ab58-ea3e9639bbea-kube-api-access-hsndp\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.417339 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.417385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.419308 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd2eff53-fa29-451f-ab58-ea3e9639bbea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.419613 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.420867 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.421815 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cd2eff53-fa29-451f-ab58-ea3e9639bbea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.423881 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.423967 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ac74eabecde18ce221c59236890df9685c9193e2b82924de940c821cdaec1238/globalmount\"" pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.425104 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.425623 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.449888 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsndp\" (UniqueName: 
\"kubernetes.io/projected/cd2eff53-fa29-451f-ab58-ea3e9639bbea-kube-api-access-hsndp\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.454480 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cd2eff53-fa29-451f-ab58-ea3e9639bbea-secrets\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.476601 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e66a82e5-a8b1-4638-9962-ff07f569eb58\") pod \"openstack-galera-0\" (UID: \"cd2eff53-fa29-451f-ab58-ea3e9639bbea\") " pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.569982 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.571131 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.576347 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.579038 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.579291 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.579467 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-z2hg5" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.593686 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.619929 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.619980 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-kolla-config\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.620015 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.620046 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tp9r\" (UniqueName: \"kubernetes.io/projected/6abf8491-0715-4c16-9653-fecbcfd68ed0-kube-api-access-5tp9r\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.620073 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-config-data\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.720873 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tp9r\" (UniqueName: \"kubernetes.io/projected/6abf8491-0715-4c16-9653-fecbcfd68ed0-kube-api-access-5tp9r\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.720926 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-config-data\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.720990 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-memcached-tls-certs\") pod \"memcached-0\" 
(UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.721017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-kolla-config\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.721047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.723706 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-kolla-config\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.723959 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-config-data\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.728961 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.732180 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.739948 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tp9r\" (UniqueName: \"kubernetes.io/projected/6abf8491-0715-4c16-9653-fecbcfd68ed0-kube-api-access-5tp9r\") pod \"memcached-0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.864096 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.865235 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.867878 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-tfg2n" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.879395 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.892317 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:27 crc kubenswrapper[4794]: I1215 14:13:27.924111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576fj\" (UniqueName: \"kubernetes.io/projected/de5ca28f-5594-46bd-9bf6-373770d3b9bb-kube-api-access-576fj\") pod \"kube-state-metrics-0\" (UID: \"de5ca28f-5594-46bd-9bf6-373770d3b9bb\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.025234 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576fj\" (UniqueName: \"kubernetes.io/projected/de5ca28f-5594-46bd-9bf6-373770d3b9bb-kube-api-access-576fj\") pod \"kube-state-metrics-0\" (UID: \"de5ca28f-5594-46bd-9bf6-373770d3b9bb\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.046961 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-576fj\" (UniqueName: \"kubernetes.io/projected/de5ca28f-5594-46bd-9bf6-373770d3b9bb-kube-api-access-576fj\") pod \"kube-state-metrics-0\" (UID: \"de5ca28f-5594-46bd-9bf6-373770d3b9bb\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.181347 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.230361 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.495061 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.496752 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.501060 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-web-config" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.501241 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-cluster-tls-config" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.501333 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-tls-assets-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.501460 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-alertmanager-dockercfg-v6qvh" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.501567 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-generated" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.521623 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.530871 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689fabcc-a835-471a-9184-728f662139cd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.530905 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.530944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/689fabcc-a835-471a-9184-728f662139cd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.530965 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.531080 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7w4\" (UniqueName: \"kubernetes.io/projected/689fabcc-a835-471a-9184-728f662139cd-kube-api-access-9c7w4\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.531200 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689fabcc-a835-471a-9184-728f662139cd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.531275 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.557808 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:13:28 crc kubenswrapper[4794]: W1215 14:13:28.590039 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6abf8491_0715_4c16_9653_fecbcfd68ed0.slice/crio-b6bf49192031bd58afdeee7f83ea49488deb805ddc0680dee6324a71482e5813 WatchSource:0}: Error finding container b6bf49192031bd58afdeee7f83ea49488deb805ddc0680dee6324a71482e5813: Status 404 returned error can't find the container with id b6bf49192031bd58afdeee7f83ea49488deb805ddc0680dee6324a71482e5813 Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.635899 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7w4\" (UniqueName: \"kubernetes.io/projected/689fabcc-a835-471a-9184-728f662139cd-kube-api-access-9c7w4\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.635974 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689fabcc-a835-471a-9184-728f662139cd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.636051 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.636141 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689fabcc-a835-471a-9184-728f662139cd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.636156 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.636227 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/689fabcc-a835-471a-9184-728f662139cd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.636259 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.642691 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.642940 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/689fabcc-a835-471a-9184-728f662139cd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.643988 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689fabcc-a835-471a-9184-728f662139cd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.644406 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.645001 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689fabcc-a835-471a-9184-728f662139cd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.649144 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/689fabcc-a835-471a-9184-728f662139cd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.674221 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7w4\" (UniqueName: \"kubernetes.io/projected/689fabcc-a835-471a-9184-728f662139cd-kube-api-access-9c7w4\") pod \"alertmanager-metric-storage-0\" (UID: \"689fabcc-a835-471a-9184-728f662139cd\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.699326 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:13:28 crc kubenswrapper[4794]: W1215 14:13:28.711836 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5ca28f_5594_46bd_9bf6_373770d3b9bb.slice/crio-fc4761e89e6afaad483d0432f2a46b70e88fd1d9305ea2750f3691af6f0a260d WatchSource:0}: Error finding container fc4761e89e6afaad483d0432f2a46b70e88fd1d9305ea2750f3691af6f0a260d: Status 404 returned error can't find the container with id fc4761e89e6afaad483d0432f2a46b70e88fd1d9305ea2750f3691af6f0a260d Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.730552 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"cd2eff53-fa29-451f-ab58-ea3e9639bbea","Type":"ContainerStarted","Data":"b3a1d6207f081e56f8bf7313f02d443a14b167d570112f779a537bbea4bde7d8"} Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.733330 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"6abf8491-0715-4c16-9653-fecbcfd68ed0","Type":"ContainerStarted","Data":"b6bf49192031bd58afdeee7f83ea49488deb805ddc0680dee6324a71482e5813"} Dec 15 14:13:28 crc 
kubenswrapper[4794]: I1215 14:13:28.736014 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"de5ca28f-5594-46bd-9bf6-373770d3b9bb","Type":"ContainerStarted","Data":"fc4761e89e6afaad483d0432f2a46b70e88fd1d9305ea2750f3691af6f0a260d"} Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.830751 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.881362 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q"] Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.882315 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.886890 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.887450 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-kf4w2" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.908643 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q"] Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.942791 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzwcn\" (UniqueName: \"kubernetes.io/projected/7f2996dd-6dd7-4d33-a893-3e8f27b82ad0-kube-api-access-bzwcn\") pod \"observability-ui-dashboards-7d5fb4cbfb-qrj6q\" (UID: \"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:28 crc kubenswrapper[4794]: I1215 14:13:28.942838 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f2996dd-6dd7-4d33-a893-3e8f27b82ad0-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-qrj6q\" (UID: \"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.044630 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f2996dd-6dd7-4d33-a893-3e8f27b82ad0-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-qrj6q\" (UID: \"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.044771 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzwcn\" (UniqueName: \"kubernetes.io/projected/7f2996dd-6dd7-4d33-a893-3e8f27b82ad0-kube-api-access-bzwcn\") pod \"observability-ui-dashboards-7d5fb4cbfb-qrj6q\" (UID: \"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.053088 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f2996dd-6dd7-4d33-a893-3e8f27b82ad0-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-qrj6q\" (UID: \"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.133525 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzwcn\" (UniqueName: \"kubernetes.io/projected/7f2996dd-6dd7-4d33-a893-3e8f27b82ad0-kube-api-access-bzwcn\") pod \"observability-ui-dashboards-7d5fb4cbfb-qrj6q\" (UID: 
\"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.145454 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.147477 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.151769 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.151983 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.152228 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-crnq2" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.152234 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.152978 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.153730 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.193321 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.250161 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251703 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251734 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251777 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36c24e40-cfed-49b8-b229-18ff0e56ec7d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251804 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251913 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5scwt\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-kube-api-access-5scwt\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251966 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.251995 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353048 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353101 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353157 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36c24e40-cfed-49b8-b229-18ff0e56ec7d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353301 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5scwt\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-kube-api-access-5scwt\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" 
Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353329 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.353378 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.361186 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.370223 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.370967 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36c24e40-cfed-49b8-b229-18ff0e56ec7d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.371338 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.376624 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.376767 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/28a45d0bd3bc303d2e5ce6038c15d61aa6d12bcb125364e266544ebd0e5c9e47/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.377180 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.377929 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") 
" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.406260 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5scwt\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-kube-api-access-5scwt\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.427855 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6747798b88-tnx6l"] Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.428749 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461556 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-oauth-serving-cert\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461643 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-trusted-ca-bundle\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461667 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-oauth-config\") pod \"console-6747798b88-tnx6l\" (UID: 
\"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461698 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-service-ca\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461725 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-serving-cert\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461766 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgsr8\" (UniqueName: \"kubernetes.io/projected/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-kube-api-access-rgsr8\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.461800 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-config\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.503956 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6747798b88-tnx6l"] Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.560342 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.564772 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-oauth-serving-cert\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.564821 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-trusted-ca-bundle\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.564842 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-oauth-config\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.564880 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-service-ca\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.564905 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-serving-cert\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.564986 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgsr8\" (UniqueName: \"kubernetes.io/projected/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-kube-api-access-rgsr8\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.565037 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-config\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.565977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-config\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.570531 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-service-ca\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.571823 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-trusted-ca-bundle\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.572312 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-oauth-serving-cert\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.579618 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-oauth-config\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.586840 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-console-serving-cert\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.623785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgsr8\" (UniqueName: \"kubernetes.io/projected/3c90c10a-10d0-4e95-9882-eb9eba5ad9de-kube-api-access-rgsr8\") pod \"console-6747798b88-tnx6l\" (UID: \"3c90c10a-10d0-4e95-9882-eb9eba5ad9de\") " pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.631291 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.779670 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:29 crc kubenswrapper[4794]: I1215 14:13:29.810120 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:13:30 crc kubenswrapper[4794]: I1215 14:13:30.023655 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q"] Dec 15 14:13:30 crc kubenswrapper[4794]: I1215 14:13:30.598465 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 15 14:13:30 crc kubenswrapper[4794]: I1215 14:13:30.690014 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6747798b88-tnx6l"] Dec 15 14:13:30 crc kubenswrapper[4794]: I1215 14:13:30.765095 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"689fabcc-a835-471a-9184-728f662139cd","Type":"ContainerStarted","Data":"df6e420dac8ee737e320303d218bb8ef33ae27e5a88940c6a993fffa97b98773"} Dec 15 14:13:30 crc kubenswrapper[4794]: I1215 14:13:30.766940 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" event={"ID":"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0","Type":"ContainerStarted","Data":"6e2f8b4cd5f19a2874489919cfdea3f97385a24733636857b819ae9a78f51d91"} Dec 15 14:13:30 crc kubenswrapper[4794]: W1215 14:13:30.792386 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c90c10a_10d0_4e95_9882_eb9eba5ad9de.slice/crio-df83864cd4bef814245f35f17856186002b89414aba2a13d992d014fc23a9755 WatchSource:0}: Error finding container 
df83864cd4bef814245f35f17856186002b89414aba2a13d992d014fc23a9755: Status 404 returned error can't find the container with id df83864cd4bef814245f35f17856186002b89414aba2a13d992d014fc23a9755 Dec 15 14:13:30 crc kubenswrapper[4794]: W1215 14:13:30.794008 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c24e40_cfed_49b8_b229_18ff0e56ec7d.slice/crio-1ef4f16b7df0a56e525a048d1362d193242fd4c736dc2f8259b59d6e3ff43387 WatchSource:0}: Error finding container 1ef4f16b7df0a56e525a048d1362d193242fd4c736dc2f8259b59d6e3ff43387: Status 404 returned error can't find the container with id 1ef4f16b7df0a56e525a048d1362d193242fd4c736dc2f8259b59d6e3ff43387 Dec 15 14:13:31 crc kubenswrapper[4794]: I1215 14:13:31.775901 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6747798b88-tnx6l" event={"ID":"3c90c10a-10d0-4e95-9882-eb9eba5ad9de","Type":"ContainerStarted","Data":"df83864cd4bef814245f35f17856186002b89414aba2a13d992d014fc23a9755"} Dec 15 14:13:31 crc kubenswrapper[4794]: I1215 14:13:31.777329 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerStarted","Data":"1ef4f16b7df0a56e525a048d1362d193242fd4c736dc2f8259b59d6e3ff43387"} Dec 15 14:13:43 crc kubenswrapper[4794]: I1215 14:13:43.899858 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6747798b88-tnx6l" event={"ID":"3c90c10a-10d0-4e95-9882-eb9eba5ad9de","Type":"ContainerStarted","Data":"3f307ec63bd09b04e513be35fad9d9a9a3f2053bd589fd44cc029487542d71ac"} Dec 15 14:13:43 crc kubenswrapper[4794]: I1215 14:13:43.942280 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6747798b88-tnx6l" podStartSLOduration=14.942261347 podStartE2EDuration="14.942261347s" podCreationTimestamp="2025-12-15 14:13:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:13:43.937783065 +0000 UTC m=+1185.789805533" watchObservedRunningTime="2025-12-15 14:13:43.942261347 +0000 UTC m=+1185.794283805" Dec 15 14:13:44 crc kubenswrapper[4794]: E1215 14:13:44.632856 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 15 14:13:44 crc kubenswrapper[4794]: E1215 14:13:44.633215 4794 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 15 14:13:44 crc kubenswrapper[4794]: E1215 14:13:44.633337 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=watcher-kuttl-default],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-576fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_watcher-kuttl-default(de5ca28f-5594-46bd-9bf6-373770d3b9bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 15 14:13:44 crc kubenswrapper[4794]: E1215 14:13:44.634994 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" Dec 15 14:13:44 crc kubenswrapper[4794]: E1215 14:13:44.907456 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" Dec 15 14:13:45 crc kubenswrapper[4794]: I1215 14:13:45.917139 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"6abf8491-0715-4c16-9653-fecbcfd68ed0","Type":"ContainerStarted","Data":"720d2f6082eec541d657b9f1354711b973744db66bbff859554cfe83c3a2ba52"} Dec 15 14:13:45 crc kubenswrapper[4794]: I1215 14:13:45.917517 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:45 crc kubenswrapper[4794]: I1215 14:13:45.919457 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" event={"ID":"7f2996dd-6dd7-4d33-a893-3e8f27b82ad0","Type":"ContainerStarted","Data":"f37f199be3fffab5512dcd9bf1e45990f30474d2306294a9549789c7a1a6618d"} Dec 15 14:13:45 crc kubenswrapper[4794]: I1215 14:13:45.920979 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"cd2eff53-fa29-451f-ab58-ea3e9639bbea","Type":"ContainerStarted","Data":"2a5fcf0a1757bcb51c93655364bda560b2f0c3ac8f6838c45d42e829fe3e9bb6"} Dec 15 14:13:45 crc kubenswrapper[4794]: I1215 14:13:45.940101 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=3.861958439 podStartE2EDuration="18.940081402s" podCreationTimestamp="2025-12-15 14:13:27 +0000 UTC" firstStartedPulling="2025-12-15 14:13:28.592432905 +0000 UTC m=+1170.444455343" lastFinishedPulling="2025-12-15 14:13:43.670555848 +0000 UTC m=+1185.522578306" observedRunningTime="2025-12-15 14:13:45.938010256 +0000 UTC m=+1187.790032704" watchObservedRunningTime="2025-12-15 14:13:45.940081402 +0000 UTC m=+1187.792103850" Dec 15 14:13:45 crc 
kubenswrapper[4794]: I1215 14:13:45.967367 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-qrj6q" podStartSLOduration=4.170454911 podStartE2EDuration="17.967346683s" podCreationTimestamp="2025-12-15 14:13:28 +0000 UTC" firstStartedPulling="2025-12-15 14:13:30.094062149 +0000 UTC m=+1171.946084587" lastFinishedPulling="2025-12-15 14:13:43.890953881 +0000 UTC m=+1185.742976359" observedRunningTime="2025-12-15 14:13:45.960130277 +0000 UTC m=+1187.812152715" watchObservedRunningTime="2025-12-15 14:13:45.967346683 +0000 UTC m=+1187.819369131" Dec 15 14:13:46 crc kubenswrapper[4794]: I1215 14:13:46.931489 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"823e3288-4f23-430b-843e-50f2e4230b46","Type":"ContainerStarted","Data":"f7af352a50ef4278a864ae7e74e4f44c01f5e9e51d095a2630a24308082abb8f"} Dec 15 14:13:46 crc kubenswrapper[4794]: I1215 14:13:46.933935 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"2eef62de-2115-49c1-bb86-8526606f7a69","Type":"ContainerStarted","Data":"22e531388ecd699a369fca04188e02da631cbd9fed9dff3fd4978b87f5d961fb"} Dec 15 14:13:47 crc kubenswrapper[4794]: I1215 14:13:47.946254 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"689fabcc-a835-471a-9184-728f662139cd","Type":"ContainerStarted","Data":"45abe1f14dfeb1c5a84e819e9feca334076242e6e001bec07e3003e08c1d118a"} Dec 15 14:13:47 crc kubenswrapper[4794]: I1215 14:13:47.948124 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerStarted","Data":"50ab02348f74c3023acd9e6d2c642bc3a7552ac621494f462ab409ab68cccabb"} Dec 15 14:13:48 crc kubenswrapper[4794]: I1215 14:13:48.956052 
4794 generic.go:334] "Generic (PLEG): container finished" podID="cd2eff53-fa29-451f-ab58-ea3e9639bbea" containerID="2a5fcf0a1757bcb51c93655364bda560b2f0c3ac8f6838c45d42e829fe3e9bb6" exitCode=0 Dec 15 14:13:48 crc kubenswrapper[4794]: I1215 14:13:48.956145 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"cd2eff53-fa29-451f-ab58-ea3e9639bbea","Type":"ContainerDied","Data":"2a5fcf0a1757bcb51c93655364bda560b2f0c3ac8f6838c45d42e829fe3e9bb6"} Dec 15 14:13:49 crc kubenswrapper[4794]: I1215 14:13:49.780654 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:49 crc kubenswrapper[4794]: I1215 14:13:49.781507 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:49 crc kubenswrapper[4794]: I1215 14:13:49.790038 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:49 crc kubenswrapper[4794]: I1215 14:13:49.969405 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"cd2eff53-fa29-451f-ab58-ea3e9639bbea","Type":"ContainerStarted","Data":"9a19108c172a22cd6f2b5e873f61795e580c61a1d387d2a38d4a6fe2fa636e40"} Dec 15 14:13:49 crc kubenswrapper[4794]: I1215 14:13:49.974648 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6747798b88-tnx6l" Dec 15 14:13:50 crc kubenswrapper[4794]: I1215 14:13:50.025293 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstack-galera-0" podStartSLOduration=8.616566607 podStartE2EDuration="24.025270209s" podCreationTimestamp="2025-12-15 14:13:26 +0000 UTC" firstStartedPulling="2025-12-15 14:13:28.262990057 +0000 UTC m=+1170.115012495" lastFinishedPulling="2025-12-15 14:13:43.671693639 
+0000 UTC m=+1185.523716097" observedRunningTime="2025-12-15 14:13:49.997412521 +0000 UTC m=+1191.849434999" watchObservedRunningTime="2025-12-15 14:13:50.025270209 +0000 UTC m=+1191.877292657" Dec 15 14:13:50 crc kubenswrapper[4794]: I1215 14:13:50.109920 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-548dc59476-42nnj"] Dec 15 14:13:52 crc kubenswrapper[4794]: I1215 14:13:52.893333 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Dec 15 14:13:54 crc kubenswrapper[4794]: I1215 14:13:54.004211 4794 generic.go:334] "Generic (PLEG): container finished" podID="689fabcc-a835-471a-9184-728f662139cd" containerID="45abe1f14dfeb1c5a84e819e9feca334076242e6e001bec07e3003e08c1d118a" exitCode=0 Dec 15 14:13:54 crc kubenswrapper[4794]: I1215 14:13:54.004276 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"689fabcc-a835-471a-9184-728f662139cd","Type":"ContainerDied","Data":"45abe1f14dfeb1c5a84e819e9feca334076242e6e001bec07e3003e08c1d118a"} Dec 15 14:13:55 crc kubenswrapper[4794]: I1215 14:13:55.017335 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerID="50ab02348f74c3023acd9e6d2c642bc3a7552ac621494f462ab409ab68cccabb" exitCode=0 Dec 15 14:13:55 crc kubenswrapper[4794]: I1215 14:13:55.017414 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerDied","Data":"50ab02348f74c3023acd9e6d2c642bc3a7552ac621494f462ab409ab68cccabb"} Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.057826 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" 
event={"ID":"689fabcc-a835-471a-9184-728f662139cd","Type":"ContainerStarted","Data":"1c485142d2c219f62e2f2b19020ad780d8c09033909728830e64e33867b8076a"} Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.059427 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"de5ca28f-5594-46bd-9bf6-373770d3b9bb","Type":"ContainerStarted","Data":"fefa8a802595cff965a530fcb0f20698d23e013791df0a1674fec5880286a3f6"} Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.060349 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.085445 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.5820503439999998 podStartE2EDuration="30.085398351s" podCreationTimestamp="2025-12-15 14:13:27 +0000 UTC" firstStartedPulling="2025-12-15 14:13:28.713261001 +0000 UTC m=+1170.565283429" lastFinishedPulling="2025-12-15 14:13:56.216608998 +0000 UTC m=+1198.068631436" observedRunningTime="2025-12-15 14:13:57.079755408 +0000 UTC m=+1198.931777856" watchObservedRunningTime="2025-12-15 14:13:57.085398351 +0000 UTC m=+1198.937420799" Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.594571 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.594631 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:57 crc kubenswrapper[4794]: I1215 14:13:57.664333 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:58 crc kubenswrapper[4794]: I1215 14:13:58.145500 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/openstack-galera-0" Dec 15 14:13:59 crc kubenswrapper[4794]: I1215 14:13:59.088348 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"689fabcc-a835-471a-9184-728f662139cd","Type":"ContainerStarted","Data":"48fa36e6f1dd6210ca05b76ffe2937dd4a9f7e650bde866dd7ee34485fdaf9df"} Dec 15 14:13:59 crc kubenswrapper[4794]: I1215 14:13:59.112939 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/alertmanager-metric-storage-0" podStartSLOduration=4.866851168 podStartE2EDuration="31.112917825s" podCreationTimestamp="2025-12-15 14:13:28 +0000 UTC" firstStartedPulling="2025-12-15 14:13:29.784631425 +0000 UTC m=+1171.636653863" lastFinishedPulling="2025-12-15 14:13:56.030698072 +0000 UTC m=+1197.882720520" observedRunningTime="2025-12-15 14:13:59.112801962 +0000 UTC m=+1200.964824420" watchObservedRunningTime="2025-12-15 14:13:59.112917825 +0000 UTC m=+1200.964940263" Dec 15 14:14:00 crc kubenswrapper[4794]: I1215 14:14:00.098444 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:14:00 crc kubenswrapper[4794]: I1215 14:14:00.101912 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Dec 15 14:14:06 crc kubenswrapper[4794]: I1215 14:14:06.151011 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerStarted","Data":"61a2a5e2b2754b4101b33b35b87e711c3fb9a9dbcf8d8c8a6629a2fd37cd81d2"} Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.658853 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-create-5rt74"] Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.661619 4794 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-5rt74" Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.679916 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-5rt74"] Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.851050 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twsp6\" (UniqueName: \"kubernetes.io/projected/64a0caa7-1566-4c3b-8b9e-4ebc50494940-kube-api-access-twsp6\") pod \"keystone-db-create-5rt74\" (UID: \"64a0caa7-1566-4c3b-8b9e-4ebc50494940\") " pod="watcher-kuttl-default/keystone-db-create-5rt74" Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.952904 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twsp6\" (UniqueName: \"kubernetes.io/projected/64a0caa7-1566-4c3b-8b9e-4ebc50494940-kube-api-access-twsp6\") pod \"keystone-db-create-5rt74\" (UID: \"64a0caa7-1566-4c3b-8b9e-4ebc50494940\") " pod="watcher-kuttl-default/keystone-db-create-5rt74" Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.990955 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twsp6\" (UniqueName: \"kubernetes.io/projected/64a0caa7-1566-4c3b-8b9e-4ebc50494940-kube-api-access-twsp6\") pod \"keystone-db-create-5rt74\" (UID: \"64a0caa7-1566-4c3b-8b9e-4ebc50494940\") " pod="watcher-kuttl-default/keystone-db-create-5rt74" Dec 15 14:14:07 crc kubenswrapper[4794]: I1215 14:14:07.996367 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-5rt74"
Dec 15 14:14:08 crc kubenswrapper[4794]: I1215 14:14:08.186517 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0"
Dec 15 14:14:08 crc kubenswrapper[4794]: I1215 14:14:08.532598 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-5rt74"]
Dec 15 14:14:08 crc kubenswrapper[4794]: W1215 14:14:08.541714 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64a0caa7_1566_4c3b_8b9e_4ebc50494940.slice/crio-7b837100217673b2c369d5f1f9916b461f53a341a6a6a419d5a8e510345137be WatchSource:0}: Error finding container 7b837100217673b2c369d5f1f9916b461f53a341a6a6a419d5a8e510345137be: Status 404 returned error can't find the container with id 7b837100217673b2c369d5f1f9916b461f53a341a6a6a419d5a8e510345137be
Dec 15 14:14:09 crc kubenswrapper[4794]: I1215 14:14:09.186070 4794 generic.go:334] "Generic (PLEG): container finished" podID="64a0caa7-1566-4c3b-8b9e-4ebc50494940" containerID="f255e61078f3a7dea1313ae49d02192dd17289a25bb7d23875ec3f362b4816f5" exitCode=0
Dec 15 14:14:09 crc kubenswrapper[4794]: I1215 14:14:09.186184 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-5rt74" event={"ID":"64a0caa7-1566-4c3b-8b9e-4ebc50494940","Type":"ContainerDied","Data":"f255e61078f3a7dea1313ae49d02192dd17289a25bb7d23875ec3f362b4816f5"}
Dec 15 14:14:09 crc kubenswrapper[4794]: I1215 14:14:09.186227 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-5rt74" event={"ID":"64a0caa7-1566-4c3b-8b9e-4ebc50494940","Type":"ContainerStarted","Data":"7b837100217673b2c369d5f1f9916b461f53a341a6a6a419d5a8e510345137be"}
Dec 15 14:14:09 crc kubenswrapper[4794]: I1215 14:14:09.190517 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerStarted","Data":"a8e7510495ca04e1c3076de490d83e4cc867775a859b41fc5b106055a575d4d8"}
Dec 15 14:14:10 crc kubenswrapper[4794]: I1215 14:14:10.524125 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-5rt74"
Dec 15 14:14:10 crc kubenswrapper[4794]: I1215 14:14:10.712748 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twsp6\" (UniqueName: \"kubernetes.io/projected/64a0caa7-1566-4c3b-8b9e-4ebc50494940-kube-api-access-twsp6\") pod \"64a0caa7-1566-4c3b-8b9e-4ebc50494940\" (UID: \"64a0caa7-1566-4c3b-8b9e-4ebc50494940\") "
Dec 15 14:14:10 crc kubenswrapper[4794]: I1215 14:14:10.724897 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a0caa7-1566-4c3b-8b9e-4ebc50494940-kube-api-access-twsp6" (OuterVolumeSpecName: "kube-api-access-twsp6") pod "64a0caa7-1566-4c3b-8b9e-4ebc50494940" (UID: "64a0caa7-1566-4c3b-8b9e-4ebc50494940"). InnerVolumeSpecName "kube-api-access-twsp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:14:10 crc kubenswrapper[4794]: I1215 14:14:10.814280 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twsp6\" (UniqueName: \"kubernetes.io/projected/64a0caa7-1566-4c3b-8b9e-4ebc50494940-kube-api-access-twsp6\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:11 crc kubenswrapper[4794]: I1215 14:14:11.234613 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-5rt74" event={"ID":"64a0caa7-1566-4c3b-8b9e-4ebc50494940","Type":"ContainerDied","Data":"7b837100217673b2c369d5f1f9916b461f53a341a6a6a419d5a8e510345137be"}
Dec 15 14:14:11 crc kubenswrapper[4794]: I1215 14:14:11.234834 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b837100217673b2c369d5f1f9916b461f53a341a6a6a419d5a8e510345137be"
Dec 15 14:14:11 crc kubenswrapper[4794]: I1215 14:14:11.234892 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-5rt74"
Dec 15 14:14:12 crc kubenswrapper[4794]: I1215 14:14:12.246351 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerStarted","Data":"16814d37a6be92ec52ca02e97b11bc0a7f62de40f6864c67f9167b942df945c0"}
Dec 15 14:14:12 crc kubenswrapper[4794]: I1215 14:14:12.311232 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=3.17193797 podStartE2EDuration="44.311211531s" podCreationTimestamp="2025-12-15 14:13:28 +0000 UTC" firstStartedPulling="2025-12-15 14:13:30.79611163 +0000 UTC m=+1172.648134068" lastFinishedPulling="2025-12-15 14:14:11.935385191 +0000 UTC m=+1213.787407629" observedRunningTime="2025-12-15 14:14:12.302927455 +0000 UTC m=+1214.154949933" watchObservedRunningTime="2025-12-15 14:14:12.311211531 +0000 UTC m=+1214.163233979"
Dec 15 14:14:14 crc kubenswrapper[4794]: I1215 14:14:14.810937 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 15 14:14:14 crc kubenswrapper[4794]: I1215 14:14:14.811389 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 15 14:14:14 crc kubenswrapper[4794]: I1215 14:14:14.815060 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.158667 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-548dc59476-42nnj" podUID="1f763743-6d60-4726-aa92-05e1e720e035" containerName="console" containerID="cri-o://d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13" gracePeriod=15
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.282006 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.602323 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-548dc59476-42nnj_1f763743-6d60-4726-aa92-05e1e720e035/console/0.log"
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.602599 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-548dc59476-42nnj"
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.717705 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-console-config\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.717788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk868\" (UniqueName: \"kubernetes.io/projected/1f763743-6d60-4726-aa92-05e1e720e035-kube-api-access-wk868\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.717875 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-serving-cert\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.717915 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-service-ca\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.717961 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-trusted-ca-bundle\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.717996 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-oauth-config\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.718061 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-oauth-serving-cert\") pod \"1f763743-6d60-4726-aa92-05e1e720e035\" (UID: \"1f763743-6d60-4726-aa92-05e1e720e035\") "
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.719136 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.719162 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-console-config" (OuterVolumeSpecName: "console-config") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.719148 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-service-ca" (OuterVolumeSpecName: "service-ca") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.719181 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.723207 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.723934 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f763743-6d60-4726-aa92-05e1e720e035-kube-api-access-wk868" (OuterVolumeSpecName: "kube-api-access-wk868") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "kube-api-access-wk868". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.726190 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1f763743-6d60-4726-aa92-05e1e720e035" (UID: "1f763743-6d60-4726-aa92-05e1e720e035"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.820295 4794 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-service-ca\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.820915 4794 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.821016 4794 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.821119 4794 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.821205 4794 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f763743-6d60-4726-aa92-05e1e720e035-console-config\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.821283 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk868\" (UniqueName: \"kubernetes.io/projected/1f763743-6d60-4726-aa92-05e1e720e035-kube-api-access-wk868\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:15 crc kubenswrapper[4794]: I1215 14:14:15.821402 4794 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f763743-6d60-4726-aa92-05e1e720e035-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.287277 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-548dc59476-42nnj_1f763743-6d60-4726-aa92-05e1e720e035/console/0.log"
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.287325 4794 generic.go:334] "Generic (PLEG): container finished" podID="1f763743-6d60-4726-aa92-05e1e720e035" containerID="d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13" exitCode=2
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.287417 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-548dc59476-42nnj"
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.287465 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548dc59476-42nnj" event={"ID":"1f763743-6d60-4726-aa92-05e1e720e035","Type":"ContainerDied","Data":"d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13"}
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.287503 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548dc59476-42nnj" event={"ID":"1f763743-6d60-4726-aa92-05e1e720e035","Type":"ContainerDied","Data":"76206210b7ae086e78dd3f7ac62b4b14196bcf5773e40f5201e81322f508f505"}
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.287521 4794 scope.go:117] "RemoveContainer" containerID="d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13"
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.317034 4794 scope.go:117] "RemoveContainer" containerID="d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13"
Dec 15 14:14:16 crc kubenswrapper[4794]: E1215 14:14:16.319194 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13\": container with ID starting with d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13 not found: ID does not exist" containerID="d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13"
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.319239 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13"} err="failed to get container status \"d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13\": rpc error: code = NotFound desc = could not find container \"d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13\": container with ID starting with d908faafd36594f6afc8bafbea89148e300e3683a6377f8c38a0a0048df82e13 not found: ID does not exist"
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.331526 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-548dc59476-42nnj"]
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.353784 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-548dc59476-42nnj"]
Dec 15 14:14:16 crc kubenswrapper[4794]: I1215 14:14:16.755520 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f763743-6d60-4726-aa92-05e1e720e035" path="/var/lib/kubelet/pods/1f763743-6d60-4726-aa92-05e1e720e035/volumes"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.628278 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-7b39-account-create-l4qsh"]
Dec 15 14:14:17 crc kubenswrapper[4794]: E1215 14:14:17.628688 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f763743-6d60-4726-aa92-05e1e720e035" containerName="console"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.628707 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f763743-6d60-4726-aa92-05e1e720e035" containerName="console"
Dec 15 14:14:17 crc kubenswrapper[4794]: E1215 14:14:17.628729 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a0caa7-1566-4c3b-8b9e-4ebc50494940" containerName="mariadb-database-create"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.628737 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a0caa7-1566-4c3b-8b9e-4ebc50494940" containerName="mariadb-database-create"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.628902 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a0caa7-1566-4c3b-8b9e-4ebc50494940" containerName="mariadb-database-create"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.628931 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f763743-6d60-4726-aa92-05e1e720e035" containerName="console"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.629539 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.631836 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-db-secret"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.638321 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7b39-account-create-l4qsh"]
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.750973 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2lf\" (UniqueName: \"kubernetes.io/projected/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989-kube-api-access-lp2lf\") pod \"keystone-7b39-account-create-l4qsh\" (UID: \"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989\") " pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.852274 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2lf\" (UniqueName: \"kubernetes.io/projected/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989-kube-api-access-lp2lf\") pod \"keystone-7b39-account-create-l4qsh\" (UID: \"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989\") " pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.873962 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2lf\" (UniqueName: \"kubernetes.io/projected/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989-kube-api-access-lp2lf\") pod \"keystone-7b39-account-create-l4qsh\" (UID: \"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989\") " pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.948975 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh"
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.954454 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.954767 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="prometheus" containerID="cri-o://61a2a5e2b2754b4101b33b35b87e711c3fb9a9dbcf8d8c8a6629a2fd37cd81d2" gracePeriod=600
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.955177 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="thanos-sidecar" containerID="cri-o://16814d37a6be92ec52ca02e97b11bc0a7f62de40f6864c67f9167b942df945c0" gracePeriod=600
Dec 15 14:14:17 crc kubenswrapper[4794]: I1215 14:14:17.955222 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="config-reloader" containerID="cri-o://a8e7510495ca04e1c3076de490d83e4cc867775a859b41fc5b106055a575d4d8" gracePeriod=600
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.307145 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerID="16814d37a6be92ec52ca02e97b11bc0a7f62de40f6864c67f9167b942df945c0" exitCode=0
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.307779 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerID="a8e7510495ca04e1c3076de490d83e4cc867775a859b41fc5b106055a575d4d8" exitCode=0
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.307800 4794 generic.go:334] "Generic (PLEG): container finished" podID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerID="61a2a5e2b2754b4101b33b35b87e711c3fb9a9dbcf8d8c8a6629a2fd37cd81d2" exitCode=0
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.307212 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerDied","Data":"16814d37a6be92ec52ca02e97b11bc0a7f62de40f6864c67f9167b942df945c0"}
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.307855 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerDied","Data":"a8e7510495ca04e1c3076de490d83e4cc867775a859b41fc5b106055a575d4d8"}
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.307879 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerDied","Data":"61a2a5e2b2754b4101b33b35b87e711c3fb9a9dbcf8d8c8a6629a2fd37cd81d2"}
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.424738 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7b39-account-create-l4qsh"]
Dec 15 14:14:18 crc kubenswrapper[4794]: W1215 14:14:18.437363 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd2dc9c2_0d41_4fa9_9104_7fe7854a0989.slice/crio-a2437d02a384990962aff046d46566ad3c7b0540217964785c3436d7bf6ce928 WatchSource:0}: Error finding container a2437d02a384990962aff046d46566ad3c7b0540217964785c3436d7bf6ce928: Status 404 returned error can't find the container with id a2437d02a384990962aff046d46566ad3c7b0540217964785c3436d7bf6ce928
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.800159 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5scwt\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-kube-api-access-5scwt\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867299 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-tls-assets\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867319 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867338 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-web-config\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867366 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config-out\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867388 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-thanos-prometheus-http-client-file\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867511 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.867537 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36c24e40-cfed-49b8-b229-18ff0e56ec7d-prometheus-metric-storage-rulefiles-0\") pod \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\" (UID: \"36c24e40-cfed-49b8-b229-18ff0e56ec7d\") "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.869523 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c24e40-cfed-49b8-b229-18ff0e56ec7d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.873451 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-kube-api-access-5scwt" (OuterVolumeSpecName: "kube-api-access-5scwt") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "kube-api-access-5scwt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.873916 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.876699 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config" (OuterVolumeSpecName: "config") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.884457 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.884475 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config-out" (OuterVolumeSpecName: "config-out") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.893935 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-web-config" (OuterVolumeSpecName: "web-config") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.894962 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "36c24e40-cfed-49b8-b229-18ff0e56ec7d" (UID: "36c24e40-cfed-49b8-b229-18ff0e56ec7d"). InnerVolumeSpecName "pvc-13e5a416-992c-4969-a5af-8edfee1a0676". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969249 4794 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-tls-assets\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969294 4794 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969303 4794 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-web-config\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969311 4794 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36c24e40-cfed-49b8-b229-18ff0e56ec7d-config-out\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969322 4794 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36c24e40-cfed-49b8-b229-18ff0e56ec7d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969367 4794 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") on node \"crc\" "
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969381 4794 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36c24e40-cfed-49b8-b229-18ff0e56ec7d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.969392 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5scwt\" (UniqueName: \"kubernetes.io/projected/36c24e40-cfed-49b8-b229-18ff0e56ec7d-kube-api-access-5scwt\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.989464 4794 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 15 14:14:18 crc kubenswrapper[4794]: I1215 14:14:18.989669 4794 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-13e5a416-992c-4969-a5af-8edfee1a0676" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676") on node "crc"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.070244 4794 reconciler_common.go:293] "Volume detached for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") on node \"crc\" DevicePath \"\""
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.317348 4794 generic.go:334] "Generic (PLEG): container finished" podID="dd2dc9c2-0d41-4fa9-9104-7fe7854a0989" containerID="b81824a9af090577ec689b82fae2bf93e6b502bcd6a63a3540b357ef464e1ce5" exitCode=0
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.317437 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh" event={"ID":"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989","Type":"ContainerDied","Data":"b81824a9af090577ec689b82fae2bf93e6b502bcd6a63a3540b357ef464e1ce5"}
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.317471 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh" event={"ID":"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989","Type":"ContainerStarted","Data":"a2437d02a384990962aff046d46566ad3c7b0540217964785c3436d7bf6ce928"}
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.319112 4794 generic.go:334] "Generic (PLEG): container finished" podID="823e3288-4f23-430b-843e-50f2e4230b46" containerID="f7af352a50ef4278a864ae7e74e4f44c01f5e9e51d095a2630a24308082abb8f" exitCode=0
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.319176 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"823e3288-4f23-430b-843e-50f2e4230b46","Type":"ContainerDied","Data":"f7af352a50ef4278a864ae7e74e4f44c01f5e9e51d095a2630a24308082abb8f"}
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.330926 4794 generic.go:334] "Generic (PLEG): container finished" podID="2eef62de-2115-49c1-bb86-8526606f7a69" containerID="22e531388ecd699a369fca04188e02da631cbd9fed9dff3fd4978b87f5d961fb" exitCode=0
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.331016 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"2eef62de-2115-49c1-bb86-8526606f7a69","Type":"ContainerDied","Data":"22e531388ecd699a369fca04188e02da631cbd9fed9dff3fd4978b87f5d961fb"}
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.341023 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"36c24e40-cfed-49b8-b229-18ff0e56ec7d","Type":"ContainerDied","Data":"1ef4f16b7df0a56e525a048d1362d193242fd4c736dc2f8259b59d6e3ff43387"}
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.341069 4794 scope.go:117] "RemoveContainer" containerID="16814d37a6be92ec52ca02e97b11bc0a7f62de40f6864c67f9167b942df945c0"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.341170 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.455167 4794 scope.go:117] "RemoveContainer" containerID="a8e7510495ca04e1c3076de490d83e4cc867775a859b41fc5b106055a575d4d8"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.468955 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.490235 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.495802 4794 scope.go:117] "RemoveContainer" containerID="61a2a5e2b2754b4101b33b35b87e711c3fb9a9dbcf8d8c8a6629a2fd37cd81d2"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.507724 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"]
Dec 15 14:14:19 crc kubenswrapper[4794]: E1215 14:14:19.508122 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="prometheus"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.508141 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="prometheus"
Dec 15 14:14:19 crc kubenswrapper[4794]: E1215 14:14:19.508169 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="init-config-reloader"
Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.508177 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="init-config-reloader"
Dec 15 14:14:19 crc kubenswrapper[4794]: E1215 14:14:19.508189 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="config-reloader"
Dec 15 14:14:19 crc 
kubenswrapper[4794]: I1215 14:14:19.508197 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="config-reloader" Dec 15 14:14:19 crc kubenswrapper[4794]: E1215 14:14:19.508207 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="thanos-sidecar" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.508214 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="thanos-sidecar" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.508392 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="config-reloader" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.508414 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="prometheus" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.508430 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" containerName="thanos-sidecar" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.510132 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.522777 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.524894 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-crnq2" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.525094 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.525214 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.525364 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-metric-storage-prometheus-svc" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.525688 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.542673 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.552887 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.585209 4794 scope.go:117] "RemoveContainer" containerID="50ab02348f74c3023acd9e6d2c642bc3a7552ac621494f462ab409ab68cccabb" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687614 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687660 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5682q\" (UniqueName: \"kubernetes.io/projected/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-kube-api-access-5682q\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687683 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687714 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687733 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687756 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687781 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687814 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687833 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-config\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687848 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.687868 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.788928 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789265 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789287 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " 
pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789350 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789385 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789403 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-config\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789418 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789439 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789487 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.789510 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5682q\" (UniqueName: \"kubernetes.io/projected/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-kube-api-access-5682q\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.794707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-config\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.796237 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.796909 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.797199 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.797230 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.804429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc 
kubenswrapper[4794]: I1215 14:14:19.804880 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.804945 4794 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.804965 4794 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/28a45d0bd3bc303d2e5ce6038c15d61aa6d12bcb125364e266544ebd0e5c9e47/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.806071 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.806481 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.809808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5682q\" (UniqueName: \"kubernetes.io/projected/b356cb49-9fcf-4e1b-8a40-ed69d1418acb-kube-api-access-5682q\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:19 crc kubenswrapper[4794]: I1215 14:14:19.857211 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13e5a416-992c-4969-a5af-8edfee1a0676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13e5a416-992c-4969-a5af-8edfee1a0676\") pod \"prometheus-metric-storage-0\" (UID: \"b356cb49-9fcf-4e1b-8a40-ed69d1418acb\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.140656 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.353018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"2eef62de-2115-49c1-bb86-8526606f7a69","Type":"ContainerStarted","Data":"27de316cca852137018634556fd5ef60f1d3f3757b717bd7978837d606b671c6"} Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.354370 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.357683 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"823e3288-4f23-430b-843e-50f2e4230b46","Type":"ContainerStarted","Data":"fce291c49f450f35af69754e121cd6e97bb21c3fcaefee14b078509afc8de58f"} Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.358024 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.378881 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podStartSLOduration=39.041308349 podStartE2EDuration="56.378865401s" podCreationTimestamp="2025-12-15 14:13:24 +0000 UTC" firstStartedPulling="2025-12-15 14:13:26.332948995 +0000 UTC m=+1168.184971433" lastFinishedPulling="2025-12-15 14:13:43.670506037 +0000 UTC m=+1185.522528485" observedRunningTime="2025-12-15 14:14:20.377286518 +0000 UTC m=+1222.229308996" watchObservedRunningTime="2025-12-15 14:14:20.378865401 +0000 UTC m=+1222.230887839" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.411691 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-server-0" podStartSLOduration=39.30401714 podStartE2EDuration="57.411673073s" 
podCreationTimestamp="2025-12-15 14:13:23 +0000 UTC" firstStartedPulling="2025-12-15 14:13:25.562994918 +0000 UTC m=+1167.415017386" lastFinishedPulling="2025-12-15 14:13:43.670650851 +0000 UTC m=+1185.522673319" observedRunningTime="2025-12-15 14:14:20.409562586 +0000 UTC m=+1222.261585044" watchObservedRunningTime="2025-12-15 14:14:20.411673073 +0000 UTC m=+1222.263695531" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.723607 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.750155 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c24e40-cfed-49b8-b229-18ff0e56ec7d" path="/var/lib/kubelet/pods/36c24e40-cfed-49b8-b229-18ff0e56ec7d/volumes" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.804043 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp2lf\" (UniqueName: \"kubernetes.io/projected/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989-kube-api-access-lp2lf\") pod \"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989\" (UID: \"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989\") " Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.808443 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989-kube-api-access-lp2lf" (OuterVolumeSpecName: "kube-api-access-lp2lf") pod "dd2dc9c2-0d41-4fa9-9104-7fe7854a0989" (UID: "dd2dc9c2-0d41-4fa9-9104-7fe7854a0989"). InnerVolumeSpecName "kube-api-access-lp2lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:14:20 crc kubenswrapper[4794]: I1215 14:14:20.905756 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp2lf\" (UniqueName: \"kubernetes.io/projected/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989-kube-api-access-lp2lf\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:21 crc kubenswrapper[4794]: I1215 14:14:21.158242 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Dec 15 14:14:21 crc kubenswrapper[4794]: I1215 14:14:21.368964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"b356cb49-9fcf-4e1b-8a40-ed69d1418acb","Type":"ContainerStarted","Data":"e6c4dd4be6748f27af3098b58adc501ebe7bbd3eadbe616b358b239ce391246e"} Dec 15 14:14:21 crc kubenswrapper[4794]: I1215 14:14:21.370861 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh" event={"ID":"dd2dc9c2-0d41-4fa9-9104-7fe7854a0989","Type":"ContainerDied","Data":"a2437d02a384990962aff046d46566ad3c7b0540217964785c3436d7bf6ce928"} Dec 15 14:14:21 crc kubenswrapper[4794]: I1215 14:14:21.370927 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2437d02a384990962aff046d46566ad3c7b0540217964785c3436d7bf6ce928" Dec 15 14:14:21 crc kubenswrapper[4794]: I1215 14:14:21.371268 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7b39-account-create-l4qsh" Dec 15 14:14:24 crc kubenswrapper[4794]: I1215 14:14:24.397831 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"b356cb49-9fcf-4e1b-8a40-ed69d1418acb","Type":"ContainerStarted","Data":"cc55c98f9646d8daa33ea5b0dfdd616957b63c64bfb9c3c3554bae848332dfc6"} Dec 15 14:14:31 crc kubenswrapper[4794]: I1215 14:14:31.470309 4794 generic.go:334] "Generic (PLEG): container finished" podID="b356cb49-9fcf-4e1b-8a40-ed69d1418acb" containerID="cc55c98f9646d8daa33ea5b0dfdd616957b63c64bfb9c3c3554bae848332dfc6" exitCode=0 Dec 15 14:14:31 crc kubenswrapper[4794]: I1215 14:14:31.470409 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"b356cb49-9fcf-4e1b-8a40-ed69d1418acb","Type":"ContainerDied","Data":"cc55c98f9646d8daa33ea5b0dfdd616957b63c64bfb9c3c3554bae848332dfc6"} Dec 15 14:14:32 crc kubenswrapper[4794]: I1215 14:14:32.486162 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"b356cb49-9fcf-4e1b-8a40-ed69d1418acb","Type":"ContainerStarted","Data":"b395554c9f37770eca451139b34f43c5d188330ad9cc804fe543b269f2e22a03"} Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.115734 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-server-0" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.541527 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"b356cb49-9fcf-4e1b-8a40-ed69d1418acb","Type":"ContainerStarted","Data":"aae580139a43cc45836dfe2b642ffd936b5a9a9c509a760339a618922c2f6574"} Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.541668 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"b356cb49-9fcf-4e1b-8a40-ed69d1418acb","Type":"ContainerStarted","Data":"493c59193147c6501ee7a538c617fd64a495c689d8afc6fd437e299fac17c76a"} Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.571851 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=16.571828231 podStartE2EDuration="16.571828231s" podCreationTimestamp="2025-12-15 14:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:14:35.566071988 +0000 UTC m=+1237.418094436" watchObservedRunningTime="2025-12-15 14:14:35.571828231 +0000 UTC m=+1237.423850679" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.730654 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-sync-kr9xn"] Dec 15 14:14:35 crc kubenswrapper[4794]: E1215 14:14:35.731315 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2dc9c2-0d41-4fa9-9104-7fe7854a0989" containerName="mariadb-account-create" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.731334 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2dc9c2-0d41-4fa9-9104-7fe7854a0989" containerName="mariadb-account-create" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.731571 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2dc9c2-0d41-4fa9-9104-7fe7854a0989" containerName="mariadb-account-create" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.732231 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.734672 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.735335 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-5gfn4" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.735541 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.736213 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.752155 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-kr9xn"] Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.768422 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-config-data\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.768497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxld\" (UniqueName: \"kubernetes.io/projected/9a1d66da-db28-48e2-82c9-bc3730fc1934-kube-api-access-wrxld\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.768564 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-combined-ca-bundle\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.870429 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-config-data\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.870495 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxld\" (UniqueName: \"kubernetes.io/projected/9a1d66da-db28-48e2-82c9-bc3730fc1934-kube-api-access-wrxld\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.870570 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-combined-ca-bundle\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.884441 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-config-data\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.885617 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-combined-ca-bundle\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:35 crc kubenswrapper[4794]: I1215 14:14:35.889090 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxld\" (UniqueName: \"kubernetes.io/projected/9a1d66da-db28-48e2-82c9-bc3730fc1934-kube-api-access-wrxld\") pod \"keystone-db-sync-kr9xn\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:36 crc kubenswrapper[4794]: I1215 14:14:36.047500 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:36 crc kubenswrapper[4794]: I1215 14:14:36.087942 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Dec 15 14:14:36 crc kubenswrapper[4794]: I1215 14:14:36.598617 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-kr9xn"] Dec 15 14:14:37 crc kubenswrapper[4794]: I1215 14:14:37.563757 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" event={"ID":"9a1d66da-db28-48e2-82c9-bc3730fc1934","Type":"ContainerStarted","Data":"245b2e2a627a36705c89cc4bbbd3375c3b72d6fbcd08ec13c03ec251bd6d1667"} Dec 15 14:14:40 crc kubenswrapper[4794]: I1215 14:14:40.141432 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:45 crc kubenswrapper[4794]: I1215 14:14:45.665266 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" event={"ID":"9a1d66da-db28-48e2-82c9-bc3730fc1934","Type":"ContainerStarted","Data":"742dd0788cdf0e6b03e619a41ed224af76564bbac9ae89340c5eae0c36f821cf"} Dec 
15 14:14:45 crc kubenswrapper[4794]: I1215 14:14:45.712618 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" podStartSLOduration=2.578431998 podStartE2EDuration="10.712570831s" podCreationTimestamp="2025-12-15 14:14:35 +0000 UTC" firstStartedPulling="2025-12-15 14:14:36.603623343 +0000 UTC m=+1238.455645781" lastFinishedPulling="2025-12-15 14:14:44.737762136 +0000 UTC m=+1246.589784614" observedRunningTime="2025-12-15 14:14:45.705672456 +0000 UTC m=+1247.557694904" watchObservedRunningTime="2025-12-15 14:14:45.712570831 +0000 UTC m=+1247.564593309" Dec 15 14:14:47 crc kubenswrapper[4794]: I1215 14:14:47.681743 4794 generic.go:334] "Generic (PLEG): container finished" podID="9a1d66da-db28-48e2-82c9-bc3730fc1934" containerID="742dd0788cdf0e6b03e619a41ed224af76564bbac9ae89340c5eae0c36f821cf" exitCode=0 Dec 15 14:14:47 crc kubenswrapper[4794]: I1215 14:14:47.681879 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" event={"ID":"9a1d66da-db28-48e2-82c9-bc3730fc1934","Type":"ContainerDied","Data":"742dd0788cdf0e6b03e619a41ed224af76564bbac9ae89340c5eae0c36f821cf"} Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.098895 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.290422 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-config-data\") pod \"9a1d66da-db28-48e2-82c9-bc3730fc1934\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.290518 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxld\" (UniqueName: \"kubernetes.io/projected/9a1d66da-db28-48e2-82c9-bc3730fc1934-kube-api-access-wrxld\") pod \"9a1d66da-db28-48e2-82c9-bc3730fc1934\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.290564 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-combined-ca-bundle\") pod \"9a1d66da-db28-48e2-82c9-bc3730fc1934\" (UID: \"9a1d66da-db28-48e2-82c9-bc3730fc1934\") " Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.299385 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a1d66da-db28-48e2-82c9-bc3730fc1934-kube-api-access-wrxld" (OuterVolumeSpecName: "kube-api-access-wrxld") pod "9a1d66da-db28-48e2-82c9-bc3730fc1934" (UID: "9a1d66da-db28-48e2-82c9-bc3730fc1934"). InnerVolumeSpecName "kube-api-access-wrxld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.339646 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a1d66da-db28-48e2-82c9-bc3730fc1934" (UID: "9a1d66da-db28-48e2-82c9-bc3730fc1934"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.376315 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-config-data" (OuterVolumeSpecName: "config-data") pod "9a1d66da-db28-48e2-82c9-bc3730fc1934" (UID: "9a1d66da-db28-48e2-82c9-bc3730fc1934"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.393723 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.393777 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxld\" (UniqueName: \"kubernetes.io/projected/9a1d66da-db28-48e2-82c9-bc3730fc1934-kube-api-access-wrxld\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.393793 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a1d66da-db28-48e2-82c9-bc3730fc1934-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.701409 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.701414 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-kr9xn" event={"ID":"9a1d66da-db28-48e2-82c9-bc3730fc1934","Type":"ContainerDied","Data":"245b2e2a627a36705c89cc4bbbd3375c3b72d6fbcd08ec13c03ec251bd6d1667"} Dec 15 14:14:49 crc kubenswrapper[4794]: I1215 14:14:49.701576 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245b2e2a627a36705c89cc4bbbd3375c3b72d6fbcd08ec13c03ec251bd6d1667" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.141760 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.150997 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.319073 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dmdct"] Dec 15 14:14:50 crc kubenswrapper[4794]: E1215 14:14:50.319427 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1d66da-db28-48e2-82c9-bc3730fc1934" containerName="keystone-db-sync" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.319446 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1d66da-db28-48e2-82c9-bc3730fc1934" containerName="keystone-db-sync" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.319628 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a1d66da-db28-48e2-82c9-bc3730fc1934" containerName="keystone-db-sync" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.320200 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.324565 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.324565 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.324857 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.324983 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-5gfn4" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.344456 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dmdct"] Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.413676 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6sp8\" (UniqueName: \"kubernetes.io/projected/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-kube-api-access-z6sp8\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.413765 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-config-data\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.413828 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-credential-keys\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.413862 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-combined-ca-bundle\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.413885 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-scripts\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.413966 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-fernet-keys\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.515573 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-fernet-keys\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.515918 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6sp8\" (UniqueName: 
\"kubernetes.io/projected/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-kube-api-access-z6sp8\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.515950 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-config-data\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.515996 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-credential-keys\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.516017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-combined-ca-bundle\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.516035 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-scripts\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.520705 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-fernet-keys\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.520723 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-combined-ca-bundle\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.522452 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-config-data\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.522687 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-scripts\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.523149 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-credential-keys\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.523495 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.536721 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.539976 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.540031 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6sp8\" (UniqueName: \"kubernetes.io/projected/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-kube-api-access-z6sp8\") pod \"keystone-bootstrap-dmdct\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.540180 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.548720 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.636982 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.715213 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719382 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-config-data\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719424 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719489 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719511 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-scripts\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719531 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-run-httpd\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719558 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9b7\" (UniqueName: \"kubernetes.io/projected/4784c05a-a7ce-41c3-8066-123ad1404ddb-kube-api-access-8j9b7\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.719590 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-log-httpd\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.820912 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-scripts\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.820964 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-run-httpd\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.821007 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j9b7\" (UniqueName: \"kubernetes.io/projected/4784c05a-a7ce-41c3-8066-123ad1404ddb-kube-api-access-8j9b7\") pod \"ceilometer-0\" (UID: 
\"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.821031 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-log-httpd\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.821135 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-config-data\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.821151 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.821207 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.821983 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-run-httpd\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.823012 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-log-httpd\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.824849 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-scripts\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.827572 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.829856 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-config-data\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.835115 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.857305 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j9b7\" (UniqueName: \"kubernetes.io/projected/4784c05a-a7ce-41c3-8066-123ad1404ddb-kube-api-access-8j9b7\") pod \"ceilometer-0\" (UID: 
\"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.921315 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:14:50 crc kubenswrapper[4794]: I1215 14:14:50.933725 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dmdct"] Dec 15 14:14:51 crc kubenswrapper[4794]: W1215 14:14:51.409180 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4784c05a_a7ce_41c3_8066_123ad1404ddb.slice/crio-552b3882516f50773ebc166e8ce9c293f03f27735e18fd586c837f0bc84c8bbe WatchSource:0}: Error finding container 552b3882516f50773ebc166e8ce9c293f03f27735e18fd586c837f0bc84c8bbe: Status 404 returned error can't find the container with id 552b3882516f50773ebc166e8ce9c293f03f27735e18fd586c837f0bc84c8bbe Dec 15 14:14:51 crc kubenswrapper[4794]: I1215 14:14:51.414677 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:14:51 crc kubenswrapper[4794]: I1215 14:14:51.717851 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerStarted","Data":"552b3882516f50773ebc166e8ce9c293f03f27735e18fd586c837f0bc84c8bbe"} Dec 15 14:14:51 crc kubenswrapper[4794]: I1215 14:14:51.719047 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" event={"ID":"49edbc3f-4712-4ac1-bb5c-06416efa5f8a","Type":"ContainerStarted","Data":"b1804910c08a77f0396eb77684680cd80717d3355d0650f3278a159f48afdaa9"} Dec 15 14:14:51 crc kubenswrapper[4794]: I1215 14:14:51.719071 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" 
event={"ID":"49edbc3f-4712-4ac1-bb5c-06416efa5f8a","Type":"ContainerStarted","Data":"7eabf6a77636eed7ff4f4d4c9a08acafa461520b6dcf5d63afeb8afeec2f23c1"} Dec 15 14:14:51 crc kubenswrapper[4794]: I1215 14:14:51.738251 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" podStartSLOduration=1.738233879 podStartE2EDuration="1.738233879s" podCreationTimestamp="2025-12-15 14:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:14:51.733527506 +0000 UTC m=+1253.585549964" watchObservedRunningTime="2025-12-15 14:14:51.738233879 +0000 UTC m=+1253.590256327" Dec 15 14:14:52 crc kubenswrapper[4794]: I1215 14:14:52.263843 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:14:54 crc kubenswrapper[4794]: I1215 14:14:54.760771 4794 generic.go:334] "Generic (PLEG): container finished" podID="49edbc3f-4712-4ac1-bb5c-06416efa5f8a" containerID="b1804910c08a77f0396eb77684680cd80717d3355d0650f3278a159f48afdaa9" exitCode=0 Dec 15 14:14:54 crc kubenswrapper[4794]: I1215 14:14:54.760859 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" event={"ID":"49edbc3f-4712-4ac1-bb5c-06416efa5f8a","Type":"ContainerDied","Data":"b1804910c08a77f0396eb77684680cd80717d3355d0650f3278a159f48afdaa9"} Dec 15 14:14:55 crc kubenswrapper[4794]: I1215 14:14:55.771449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerStarted","Data":"5d82e8f49a5cc89b0f10bbc792ee525398f730ebf7dc3da23b9243c7143ab5ab"} Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.143680 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.306564 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6sp8\" (UniqueName: \"kubernetes.io/projected/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-kube-api-access-z6sp8\") pod \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.307508 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-combined-ca-bundle\") pod \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.307561 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-fernet-keys\") pod \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.307631 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-credential-keys\") pod \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.307669 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-scripts\") pod \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.307723 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-config-data\") pod \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\" (UID: \"49edbc3f-4712-4ac1-bb5c-06416efa5f8a\") " Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.313338 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "49edbc3f-4712-4ac1-bb5c-06416efa5f8a" (UID: "49edbc3f-4712-4ac1-bb5c-06416efa5f8a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.313379 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-scripts" (OuterVolumeSpecName: "scripts") pod "49edbc3f-4712-4ac1-bb5c-06416efa5f8a" (UID: "49edbc3f-4712-4ac1-bb5c-06416efa5f8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.314744 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-kube-api-access-z6sp8" (OuterVolumeSpecName: "kube-api-access-z6sp8") pod "49edbc3f-4712-4ac1-bb5c-06416efa5f8a" (UID: "49edbc3f-4712-4ac1-bb5c-06416efa5f8a"). InnerVolumeSpecName "kube-api-access-z6sp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.316719 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "49edbc3f-4712-4ac1-bb5c-06416efa5f8a" (UID: "49edbc3f-4712-4ac1-bb5c-06416efa5f8a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.337797 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-config-data" (OuterVolumeSpecName: "config-data") pod "49edbc3f-4712-4ac1-bb5c-06416efa5f8a" (UID: "49edbc3f-4712-4ac1-bb5c-06416efa5f8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.345005 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49edbc3f-4712-4ac1-bb5c-06416efa5f8a" (UID: "49edbc3f-4712-4ac1-bb5c-06416efa5f8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.409996 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.410034 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.410052 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.410064 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:56 crc 
kubenswrapper[4794]: I1215 14:14:56.410088 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.410101 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6sp8\" (UniqueName: \"kubernetes.io/projected/49edbc3f-4712-4ac1-bb5c-06416efa5f8a-kube-api-access-z6sp8\") on node \"crc\" DevicePath \"\"" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.780876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" event={"ID":"49edbc3f-4712-4ac1-bb5c-06416efa5f8a","Type":"ContainerDied","Data":"7eabf6a77636eed7ff4f4d4c9a08acafa461520b6dcf5d63afeb8afeec2f23c1"} Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.780909 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eabf6a77636eed7ff4f4d4c9a08acafa461520b6dcf5d63afeb8afeec2f23c1" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.780945 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dmdct" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.838463 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dmdct"] Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.854423 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dmdct"] Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.937804 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-cnrfp"] Dec 15 14:14:56 crc kubenswrapper[4794]: E1215 14:14:56.938445 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49edbc3f-4712-4ac1-bb5c-06416efa5f8a" containerName="keystone-bootstrap" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.938533 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49edbc3f-4712-4ac1-bb5c-06416efa5f8a" containerName="keystone-bootstrap" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.938771 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49edbc3f-4712-4ac1-bb5c-06416efa5f8a" containerName="keystone-bootstrap" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.939435 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.941831 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.945219 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.945408 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-5gfn4" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.950731 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 15 14:14:56 crc kubenswrapper[4794]: I1215 14:14:56.951558 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-cnrfp"] Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.020363 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-combined-ca-bundle\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.020412 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-fernet-keys\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.020498 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmmz\" (UniqueName: 
\"kubernetes.io/projected/df16e0e7-1842-4fef-9827-42ddbe237256-kube-api-access-wqmmz\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.020536 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-scripts\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.020705 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-config-data\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.020886 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-credential-keys\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.121940 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-combined-ca-bundle\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.121980 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-fernet-keys\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.122064 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmmz\" (UniqueName: \"kubernetes.io/projected/df16e0e7-1842-4fef-9827-42ddbe237256-kube-api-access-wqmmz\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.122094 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-scripts\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.122561 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-config-data\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.122618 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-credential-keys\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.127863 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-scripts\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.128241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-combined-ca-bundle\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.128327 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-config-data\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.128698 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-fernet-keys\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.140441 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-credential-keys\") pod \"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.148040 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmmz\" (UniqueName: \"kubernetes.io/projected/df16e0e7-1842-4fef-9827-42ddbe237256-kube-api-access-wqmmz\") pod 
\"keystone-bootstrap-cnrfp\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.310996 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.791293 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerStarted","Data":"539b297155d5db9b3926f69b6f8d22b11c2a5b6e8b366c88c34f31ec9a77a147"} Dec 15 14:14:57 crc kubenswrapper[4794]: I1215 14:14:57.812571 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-cnrfp"] Dec 15 14:14:58 crc kubenswrapper[4794]: I1215 14:14:58.748704 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49edbc3f-4712-4ac1-bb5c-06416efa5f8a" path="/var/lib/kubelet/pods/49edbc3f-4712-4ac1-bb5c-06416efa5f8a/volumes" Dec 15 14:14:58 crc kubenswrapper[4794]: I1215 14:14:58.805995 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" event={"ID":"df16e0e7-1842-4fef-9827-42ddbe237256","Type":"ContainerStarted","Data":"8bc2791eac3f32df055f689898e397c05cb4f0af73069c0da511781cd3f9d0f7"} Dec 15 14:14:58 crc kubenswrapper[4794]: I1215 14:14:58.808334 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" event={"ID":"df16e0e7-1842-4fef-9827-42ddbe237256","Type":"ContainerStarted","Data":"798ed49c998e6bb736a873df3cce163bae39bed8eb387c3e3dc7b02bdd8944aa"} Dec 15 14:14:58 crc kubenswrapper[4794]: I1215 14:14:58.824192 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" podStartSLOduration=2.824173775 podStartE2EDuration="2.824173775s" podCreationTimestamp="2025-12-15 
14:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:14:58.823303791 +0000 UTC m=+1260.675326249" watchObservedRunningTime="2025-12-15 14:14:58.824173775 +0000 UTC m=+1260.676196213" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.142509 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk"] Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.144366 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.149566 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk"] Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.154434 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.154829 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.301418 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-config-volume\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.301493 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-secret-volume\") 
pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.301854 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gl5\" (UniqueName: \"kubernetes.io/projected/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-kube-api-access-f4gl5\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.403916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-secret-volume\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.404223 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gl5\" (UniqueName: \"kubernetes.io/projected/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-kube-api-access-f4gl5\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.404309 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-config-volume\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.405962 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-config-volume\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.418720 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-secret-volume\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.427869 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gl5\" (UniqueName: \"kubernetes.io/projected/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-kube-api-access-f4gl5\") pod \"collect-profiles-29430135-vxtgk\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.461924 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:00 crc kubenswrapper[4794]: I1215 14:15:00.910631 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk"] Dec 15 14:15:01 crc kubenswrapper[4794]: I1215 14:15:01.854940 4794 generic.go:334] "Generic (PLEG): container finished" podID="df16e0e7-1842-4fef-9827-42ddbe237256" containerID="8bc2791eac3f32df055f689898e397c05cb4f0af73069c0da511781cd3f9d0f7" exitCode=0 Dec 15 14:15:01 crc kubenswrapper[4794]: I1215 14:15:01.854983 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" event={"ID":"df16e0e7-1842-4fef-9827-42ddbe237256","Type":"ContainerDied","Data":"8bc2791eac3f32df055f689898e397c05cb4f0af73069c0da511781cd3f9d0f7"} Dec 15 14:15:05 crc kubenswrapper[4794]: I1215 14:15:05.890625 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" event={"ID":"e435c3d6-d3cd-45a5-b4f9-d40a1b963208","Type":"ContainerStarted","Data":"bceb99edf5008989f4d35c333aefbbb43ad52747eb30e2c7986b964b40b6b5b6"} Dec 15 14:15:05 crc kubenswrapper[4794]: I1215 14:15:05.892228 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" event={"ID":"df16e0e7-1842-4fef-9827-42ddbe237256","Type":"ContainerDied","Data":"798ed49c998e6bb736a873df3cce163bae39bed8eb387c3e3dc7b02bdd8944aa"} Dec 15 14:15:05 crc kubenswrapper[4794]: I1215 14:15:05.892261 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798ed49c998e6bb736a873df3cce163bae39bed8eb387c3e3dc7b02bdd8944aa" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.202090 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.261840 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-scripts\") pod \"df16e0e7-1842-4fef-9827-42ddbe237256\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.261918 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-credential-keys\") pod \"df16e0e7-1842-4fef-9827-42ddbe237256\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.261970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqmmz\" (UniqueName: \"kubernetes.io/projected/df16e0e7-1842-4fef-9827-42ddbe237256-kube-api-access-wqmmz\") pod \"df16e0e7-1842-4fef-9827-42ddbe237256\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.262053 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-combined-ca-bundle\") pod \"df16e0e7-1842-4fef-9827-42ddbe237256\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.262107 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-fernet-keys\") pod \"df16e0e7-1842-4fef-9827-42ddbe237256\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.262131 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-config-data\") pod \"df16e0e7-1842-4fef-9827-42ddbe237256\" (UID: \"df16e0e7-1842-4fef-9827-42ddbe237256\") " Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.268777 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "df16e0e7-1842-4fef-9827-42ddbe237256" (UID: "df16e0e7-1842-4fef-9827-42ddbe237256"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.269403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df16e0e7-1842-4fef-9827-42ddbe237256-kube-api-access-wqmmz" (OuterVolumeSpecName: "kube-api-access-wqmmz") pod "df16e0e7-1842-4fef-9827-42ddbe237256" (UID: "df16e0e7-1842-4fef-9827-42ddbe237256"). InnerVolumeSpecName "kube-api-access-wqmmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.271573 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "df16e0e7-1842-4fef-9827-42ddbe237256" (UID: "df16e0e7-1842-4fef-9827-42ddbe237256"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.278721 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-scripts" (OuterVolumeSpecName: "scripts") pod "df16e0e7-1842-4fef-9827-42ddbe237256" (UID: "df16e0e7-1842-4fef-9827-42ddbe237256"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.298918 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df16e0e7-1842-4fef-9827-42ddbe237256" (UID: "df16e0e7-1842-4fef-9827-42ddbe237256"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.322783 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-config-data" (OuterVolumeSpecName: "config-data") pod "df16e0e7-1842-4fef-9827-42ddbe237256" (UID: "df16e0e7-1842-4fef-9827-42ddbe237256"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.363738 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.363783 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqmmz\" (UniqueName: \"kubernetes.io/projected/df16e0e7-1842-4fef-9827-42ddbe237256-kube-api-access-wqmmz\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.363795 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.363805 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-fernet-keys\") on node \"crc\" 
DevicePath \"\"" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.363813 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.363836 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df16e0e7-1842-4fef-9827-42ddbe237256-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.908005 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerStarted","Data":"236b004f1464247e283d1f285952f0752cc36801abf4295c46cb8a6f728391f1"} Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.910863 4794 generic.go:334] "Generic (PLEG): container finished" podID="e435c3d6-d3cd-45a5-b4f9-d40a1b963208" containerID="7c2aecafc94e2e4378da387d7f7181a8852de886fbce251b5c6181bb04acdd30" exitCode=0 Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.910970 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-cnrfp" Dec 15 14:15:06 crc kubenswrapper[4794]: I1215 14:15:06.912262 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" event={"ID":"e435c3d6-d3cd-45a5-b4f9-d40a1b963208","Type":"ContainerDied","Data":"7c2aecafc94e2e4378da387d7f7181a8852de886fbce251b5c6181bb04acdd30"} Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.417401 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t"] Dec 15 14:15:07 crc kubenswrapper[4794]: E1215 14:15:07.418025 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df16e0e7-1842-4fef-9827-42ddbe237256" containerName="keystone-bootstrap" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.418038 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="df16e0e7-1842-4fef-9827-42ddbe237256" containerName="keystone-bootstrap" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.418198 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="df16e0e7-1842-4fef-9827-42ddbe237256" containerName="keystone-bootstrap" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.418751 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.474347 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.475907 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-5gfn4" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.476018 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-public-svc" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.476241 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-internal-svc" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.477463 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.479458 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.480901 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-combined-ca-bundle\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.481170 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-public-tls-certs\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 
crc kubenswrapper[4794]: I1215 14:15:07.481270 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-credential-keys\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.481366 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-internal-tls-certs\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.481460 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-scripts\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.481552 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnx4\" (UniqueName: \"kubernetes.io/projected/05371ef4-81ff-4685-a82b-98d71c8bf9cb-kube-api-access-nrnx4\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.481667 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-config-data\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " 
pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.482396 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-fernet-keys\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.481013 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t"] Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.583799 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnx4\" (UniqueName: \"kubernetes.io/projected/05371ef4-81ff-4685-a82b-98d71c8bf9cb-kube-api-access-nrnx4\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.583853 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-config-data\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.583876 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-fernet-keys\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.583930 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-combined-ca-bundle\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.583973 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-public-tls-certs\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.584000 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-credential-keys\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.584022 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-internal-tls-certs\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.584041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-scripts\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.592179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-scripts\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.592271 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-public-tls-certs\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.592342 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-config-data\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.592660 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-credential-keys\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.593130 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-internal-tls-certs\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.593423 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-fernet-keys\") pod 
\"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.599311 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnx4\" (UniqueName: \"kubernetes.io/projected/05371ef4-81ff-4685-a82b-98d71c8bf9cb-kube-api-access-nrnx4\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.600116 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-combined-ca-bundle\") pod \"keystone-7dcdc8d5f6-mfh4t\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:07 crc kubenswrapper[4794]: I1215 14:15:07.786693 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.271892 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t"] Dec 15 14:15:08 crc kubenswrapper[4794]: W1215 14:15:08.292436 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05371ef4_81ff_4685_a82b_98d71c8bf9cb.slice/crio-8bdde47970f64c73f4ef2daf6a0567d5352a2bd8c2b7e958ff73191cce280465 WatchSource:0}: Error finding container 8bdde47970f64c73f4ef2daf6a0567d5352a2bd8c2b7e958ff73191cce280465: Status 404 returned error can't find the container with id 8bdde47970f64c73f4ef2daf6a0567d5352a2bd8c2b7e958ff73191cce280465 Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.395034 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.496289 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-config-volume\") pod \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.496394 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-secret-volume\") pod \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.496415 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4gl5\" (UniqueName: \"kubernetes.io/projected/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-kube-api-access-f4gl5\") pod \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\" (UID: \"e435c3d6-d3cd-45a5-b4f9-d40a1b963208\") " Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.497025 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-config-volume" (OuterVolumeSpecName: "config-volume") pod "e435c3d6-d3cd-45a5-b4f9-d40a1b963208" (UID: "e435c3d6-d3cd-45a5-b4f9-d40a1b963208"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.499511 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e435c3d6-d3cd-45a5-b4f9-d40a1b963208" (UID: "e435c3d6-d3cd-45a5-b4f9-d40a1b963208"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.499721 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-kube-api-access-f4gl5" (OuterVolumeSpecName: "kube-api-access-f4gl5") pod "e435c3d6-d3cd-45a5-b4f9-d40a1b963208" (UID: "e435c3d6-d3cd-45a5-b4f9-d40a1b963208"). InnerVolumeSpecName "kube-api-access-f4gl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.598362 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.598401 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4gl5\" (UniqueName: \"kubernetes.io/projected/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-kube-api-access-f4gl5\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.598414 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e435c3d6-d3cd-45a5-b4f9-d40a1b963208-config-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.932693 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" event={"ID":"05371ef4-81ff-4685-a82b-98d71c8bf9cb","Type":"ContainerStarted","Data":"350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea"} Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.933072 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" event={"ID":"05371ef4-81ff-4685-a82b-98d71c8bf9cb","Type":"ContainerStarted","Data":"8bdde47970f64c73f4ef2daf6a0567d5352a2bd8c2b7e958ff73191cce280465"} Dec 15 14:15:08 crc 
kubenswrapper[4794]: I1215 14:15:08.933709 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.936045 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" event={"ID":"e435c3d6-d3cd-45a5-b4f9-d40a1b963208","Type":"ContainerDied","Data":"bceb99edf5008989f4d35c333aefbbb43ad52747eb30e2c7986b964b40b6b5b6"} Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.936078 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430135-vxtgk" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.936082 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bceb99edf5008989f4d35c333aefbbb43ad52747eb30e2c7986b964b40b6b5b6" Dec 15 14:15:08 crc kubenswrapper[4794]: I1215 14:15:08.953105 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" podStartSLOduration=1.9530910719999999 podStartE2EDuration="1.953091072s" podCreationTimestamp="2025-12-15 14:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:15:08.949166511 +0000 UTC m=+1270.801188939" watchObservedRunningTime="2025-12-15 14:15:08.953091072 +0000 UTC m=+1270.805113510" Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.004325 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerStarted","Data":"1cfddd9e674d897c23ed707ec16c8425ff64dd574c3e42f5e519b980c4f6f345"} Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.004972 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.004680 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="proxy-httpd" containerID="cri-o://1cfddd9e674d897c23ed707ec16c8425ff64dd574c3e42f5e519b980c4f6f345" gracePeriod=30 Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.004572 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-central-agent" containerID="cri-o://5d82e8f49a5cc89b0f10bbc792ee525398f730ebf7dc3da23b9243c7143ab5ab" gracePeriod=30 Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.004720 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-notification-agent" containerID="cri-o://539b297155d5db9b3926f69b6f8d22b11c2a5b6e8b366c88c34f31ec9a77a147" gracePeriod=30 Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.004822 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="sg-core" containerID="cri-o://236b004f1464247e283d1f285952f0752cc36801abf4295c46cb8a6f728391f1" gracePeriod=30 Dec 15 14:15:16 crc kubenswrapper[4794]: I1215 14:15:16.062788 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.569810268 podStartE2EDuration="26.062761508s" podCreationTimestamp="2025-12-15 14:14:50 +0000 UTC" firstStartedPulling="2025-12-15 14:14:51.412039879 +0000 UTC m=+1253.264062317" lastFinishedPulling="2025-12-15 14:15:14.904991119 +0000 UTC m=+1276.757013557" observedRunningTime="2025-12-15 14:15:16.050234095 
+0000 UTC m=+1277.902256573" watchObservedRunningTime="2025-12-15 14:15:16.062761508 +0000 UTC m=+1277.914783986" Dec 15 14:15:17 crc kubenswrapper[4794]: I1215 14:15:17.024709 4794 generic.go:334] "Generic (PLEG): container finished" podID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerID="1cfddd9e674d897c23ed707ec16c8425ff64dd574c3e42f5e519b980c4f6f345" exitCode=0 Dec 15 14:15:17 crc kubenswrapper[4794]: I1215 14:15:17.025226 4794 generic.go:334] "Generic (PLEG): container finished" podID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerID="236b004f1464247e283d1f285952f0752cc36801abf4295c46cb8a6f728391f1" exitCode=2 Dec 15 14:15:17 crc kubenswrapper[4794]: I1215 14:15:17.025251 4794 generic.go:334] "Generic (PLEG): container finished" podID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerID="5d82e8f49a5cc89b0f10bbc792ee525398f730ebf7dc3da23b9243c7143ab5ab" exitCode=0 Dec 15 14:15:17 crc kubenswrapper[4794]: I1215 14:15:17.024800 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerDied","Data":"1cfddd9e674d897c23ed707ec16c8425ff64dd574c3e42f5e519b980c4f6f345"} Dec 15 14:15:17 crc kubenswrapper[4794]: I1215 14:15:17.025319 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerDied","Data":"236b004f1464247e283d1f285952f0752cc36801abf4295c46cb8a6f728391f1"} Dec 15 14:15:17 crc kubenswrapper[4794]: I1215 14:15:17.025351 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerDied","Data":"5d82e8f49a5cc89b0f10bbc792ee525398f730ebf7dc3da23b9243c7143ab5ab"} Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.037117 4794 generic.go:334] "Generic (PLEG): container finished" podID="4784c05a-a7ce-41c3-8066-123ad1404ddb" 
containerID="539b297155d5db9b3926f69b6f8d22b11c2a5b6e8b366c88c34f31ec9a77a147" exitCode=0 Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.037195 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerDied","Data":"539b297155d5db9b3926f69b6f8d22b11c2a5b6e8b366c88c34f31ec9a77a147"} Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.243708 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.284735 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-sg-core-conf-yaml\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.284840 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-run-httpd\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.284914 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-config-data\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.284974 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-log-httpd\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc 
kubenswrapper[4794]: I1215 14:15:18.285044 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.285575 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j9b7\" (UniqueName: \"kubernetes.io/projected/4784c05a-a7ce-41c3-8066-123ad1404ddb-kube-api-access-8j9b7\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.285725 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-scripts\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.286832 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.286946 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.287916 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.287936 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4784c05a-a7ce-41c3-8066-123ad1404ddb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.291441 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-scripts" (OuterVolumeSpecName: "scripts") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.293657 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4784c05a-a7ce-41c3-8066-123ad1404ddb-kube-api-access-8j9b7" (OuterVolumeSpecName: "kube-api-access-8j9b7") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "kube-api-access-8j9b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.317890 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.394974 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.395924 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle\") pod \"4784c05a-a7ce-41c3-8066-123ad1404ddb\" (UID: \"4784c05a-a7ce-41c3-8066-123ad1404ddb\") " Dec 15 14:15:18 crc kubenswrapper[4794]: W1215 14:15:18.396223 4794 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4784c05a-a7ce-41c3-8066-123ad1404ddb/volumes/kubernetes.io~secret/combined-ca-bundle Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.396264 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.396314 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j9b7\" (UniqueName: \"kubernetes.io/projected/4784c05a-a7ce-41c3-8066-123ad1404ddb-kube-api-access-8j9b7\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.396330 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.396343 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.422626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-config-data" (OuterVolumeSpecName: "config-data") pod "4784c05a-a7ce-41c3-8066-123ad1404ddb" (UID: "4784c05a-a7ce-41c3-8066-123ad1404ddb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.498071 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:18 crc kubenswrapper[4794]: I1215 14:15:18.498102 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4784c05a-a7ce-41c3-8066-123ad1404ddb-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.048193 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4784c05a-a7ce-41c3-8066-123ad1404ddb","Type":"ContainerDied","Data":"552b3882516f50773ebc166e8ce9c293f03f27735e18fd586c837f0bc84c8bbe"} Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.048570 4794 scope.go:117] "RemoveContainer" containerID="1cfddd9e674d897c23ed707ec16c8425ff64dd574c3e42f5e519b980c4f6f345" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.048814 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.083445 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.090322 4794 scope.go:117] "RemoveContainer" containerID="236b004f1464247e283d1f285952f0752cc36801abf4295c46cb8a6f728391f1" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.094792 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119406 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:19 crc kubenswrapper[4794]: E1215 14:15:19.119738 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e435c3d6-d3cd-45a5-b4f9-d40a1b963208" containerName="collect-profiles" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119756 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e435c3d6-d3cd-45a5-b4f9-d40a1b963208" containerName="collect-profiles" Dec 15 14:15:19 crc kubenswrapper[4794]: E1215 14:15:19.119770 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-notification-agent" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119778 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-notification-agent" Dec 15 14:15:19 crc kubenswrapper[4794]: E1215 14:15:19.119800 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-central-agent" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119807 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-central-agent" Dec 15 14:15:19 crc 
kubenswrapper[4794]: E1215 14:15:19.119821 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="proxy-httpd" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119828 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="proxy-httpd" Dec 15 14:15:19 crc kubenswrapper[4794]: E1215 14:15:19.119842 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="sg-core" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119849 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="sg-core" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.119991 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="sg-core" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.120004 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e435c3d6-d3cd-45a5-b4f9-d40a1b963208" containerName="collect-profiles" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.120017 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-notification-agent" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.120030 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="ceilometer-central-agent" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.120041 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" containerName="proxy-httpd" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.128173 4794 scope.go:117] "RemoveContainer" containerID="539b297155d5db9b3926f69b6f8d22b11c2a5b6e8b366c88c34f31ec9a77a147" Dec 15 14:15:19 crc kubenswrapper[4794]: 
I1215 14:15:19.129092 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.132534 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.132775 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.137022 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.188383 4794 scope.go:117] "RemoveContainer" containerID="5d82e8f49a5cc89b0f10bbc792ee525398f730ebf7dc3da23b9243c7143ab5ab" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.211250 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-log-httpd\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.211304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-scripts\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.211328 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-config-data\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc 
kubenswrapper[4794]: I1215 14:15:19.211373 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.211429 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.211476 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-run-httpd\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.211496 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fql\" (UniqueName: \"kubernetes.io/projected/479822cb-595a-4182-a50c-f22a1cff3f17-kube-api-access-s5fql\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.313480 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-run-httpd\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.313966 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s5fql\" (UniqueName: \"kubernetes.io/projected/479822cb-595a-4182-a50c-f22a1cff3f17-kube-api-access-s5fql\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314096 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-log-httpd\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314194 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-scripts\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314366 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-config-data\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314466 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314108 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-run-httpd\") pod \"ceilometer-0\" (UID: 
\"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314560 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-log-httpd\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.314784 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.319705 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-config-data\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.320497 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-scripts\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.325737 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.329503 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.332894 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fql\" (UniqueName: \"kubernetes.io/projected/479822cb-595a-4182-a50c-f22a1cff3f17-kube-api-access-s5fql\") pod \"ceilometer-0\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.447038 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:19 crc kubenswrapper[4794]: I1215 14:15:19.888800 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:19 crc kubenswrapper[4794]: W1215 14:15:19.893488 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod479822cb_595a_4182_a50c_f22a1cff3f17.slice/crio-84a76fba2b33ec8e285c6fad280bb71b04ffc3ccf209d32b4c68336087e84533 WatchSource:0}: Error finding container 84a76fba2b33ec8e285c6fad280bb71b04ffc3ccf209d32b4c68336087e84533: Status 404 returned error can't find the container with id 84a76fba2b33ec8e285c6fad280bb71b04ffc3ccf209d32b4c68336087e84533 Dec 15 14:15:20 crc kubenswrapper[4794]: I1215 14:15:20.059041 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerStarted","Data":"84a76fba2b33ec8e285c6fad280bb71b04ffc3ccf209d32b4c68336087e84533"} Dec 15 14:15:20 crc kubenswrapper[4794]: I1215 14:15:20.751928 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4784c05a-a7ce-41c3-8066-123ad1404ddb" 
path="/var/lib/kubelet/pods/4784c05a-a7ce-41c3-8066-123ad1404ddb/volumes" Dec 15 14:15:22 crc kubenswrapper[4794]: I1215 14:15:22.078520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerStarted","Data":"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f"} Dec 15 14:15:24 crc kubenswrapper[4794]: I1215 14:15:24.098037 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerStarted","Data":"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269"} Dec 15 14:15:24 crc kubenswrapper[4794]: I1215 14:15:24.098671 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerStarted","Data":"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b"} Dec 15 14:15:24 crc kubenswrapper[4794]: I1215 14:15:24.534502 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:15:24 crc kubenswrapper[4794]: I1215 14:15:24.534597 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:15:26 crc kubenswrapper[4794]: I1215 14:15:26.118700 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerStarted","Data":"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b"} Dec 15 14:15:26 crc kubenswrapper[4794]: I1215 14:15:26.119256 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:26 crc kubenswrapper[4794]: I1215 14:15:26.151191 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.145808461 podStartE2EDuration="7.151167481s" podCreationTimestamp="2025-12-15 14:15:19 +0000 UTC" firstStartedPulling="2025-12-15 14:15:19.895162739 +0000 UTC m=+1281.747185167" lastFinishedPulling="2025-12-15 14:15:24.900521739 +0000 UTC m=+1286.752544187" observedRunningTime="2025-12-15 14:15:26.149812663 +0000 UTC m=+1288.001835151" watchObservedRunningTime="2025-12-15 14:15:26.151167481 +0000 UTC m=+1288.003189979" Dec 15 14:15:39 crc kubenswrapper[4794]: I1215 14:15:39.641547 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.713249 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.714725 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.716898 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-config-secret" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.719547 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.727810 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.728540 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstackclient-openstackclient-dockercfg-k8b88" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.805783 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrf6\" (UniqueName: \"kubernetes.io/projected/5b0af393-3fce-4ed9-921e-9b56d16f4b02-kube-api-access-cnrf6\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.805898 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b0af393-3fce-4ed9-921e-9b56d16f4b02-openstack-config\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.805921 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b0af393-3fce-4ed9-921e-9b56d16f4b02-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " 
pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.805997 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0af393-3fce-4ed9-921e-9b56d16f4b02-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.908378 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b0af393-3fce-4ed9-921e-9b56d16f4b02-openstack-config\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.908422 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b0af393-3fce-4ed9-921e-9b56d16f4b02-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.908866 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0af393-3fce-4ed9-921e-9b56d16f4b02-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.909154 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrf6\" (UniqueName: \"kubernetes.io/projected/5b0af393-3fce-4ed9-921e-9b56d16f4b02-kube-api-access-cnrf6\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 
crc kubenswrapper[4794]: I1215 14:15:42.909785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b0af393-3fce-4ed9-921e-9b56d16f4b02-openstack-config\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.914018 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0af393-3fce-4ed9-921e-9b56d16f4b02-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.915395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b0af393-3fce-4ed9-921e-9b56d16f4b02-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:42 crc kubenswrapper[4794]: I1215 14:15:42.927997 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrf6\" (UniqueName: \"kubernetes.io/projected/5b0af393-3fce-4ed9-921e-9b56d16f4b02-kube-api-access-cnrf6\") pod \"openstackclient\" (UID: \"5b0af393-3fce-4ed9-921e-9b56d16f4b02\") " pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:43 crc kubenswrapper[4794]: I1215 14:15:43.065760 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Dec 15 14:15:43 crc kubenswrapper[4794]: I1215 14:15:43.483745 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Dec 15 14:15:43 crc kubenswrapper[4794]: W1215 14:15:43.488262 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0af393_3fce_4ed9_921e_9b56d16f4b02.slice/crio-36117d8846fc83f5e86e2d3e58b5cea4a2cfbb107d8c88a6cb0aade257ad0820 WatchSource:0}: Error finding container 36117d8846fc83f5e86e2d3e58b5cea4a2cfbb107d8c88a6cb0aade257ad0820: Status 404 returned error can't find the container with id 36117d8846fc83f5e86e2d3e58b5cea4a2cfbb107d8c88a6cb0aade257ad0820 Dec 15 14:15:44 crc kubenswrapper[4794]: I1215 14:15:44.269733 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"5b0af393-3fce-4ed9-921e-9b56d16f4b02","Type":"ContainerStarted","Data":"36117d8846fc83f5e86e2d3e58b5cea4a2cfbb107d8c88a6cb0aade257ad0820"} Dec 15 14:15:49 crc kubenswrapper[4794]: I1215 14:15:49.455025 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:51 crc kubenswrapper[4794]: I1215 14:15:51.880319 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:15:51 crc kubenswrapper[4794]: I1215 14:15:51.881010 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" containerName="kube-state-metrics" containerID="cri-o://fefa8a802595cff965a530fcb0f20698d23e013791df0a1674fec5880286a3f6" gracePeriod=30 Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.329853 4794 generic.go:334] "Generic (PLEG): container finished" podID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" 
containerID="fefa8a802595cff965a530fcb0f20698d23e013791df0a1674fec5880286a3f6" exitCode=2 Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.329938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"de5ca28f-5594-46bd-9bf6-373770d3b9bb","Type":"ContainerDied","Data":"fefa8a802595cff965a530fcb0f20698d23e013791df0a1674fec5880286a3f6"} Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.947966 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.948238 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-central-agent" containerID="cri-o://9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f" gracePeriod=30 Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.948298 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="sg-core" containerID="cri-o://a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269" gracePeriod=30 Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.948370 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-notification-agent" containerID="cri-o://7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b" gracePeriod=30 Dec 15 14:15:52 crc kubenswrapper[4794]: I1215 14:15:52.948366 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="proxy-httpd" containerID="cri-o://617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b" 
gracePeriod=30 Dec 15 14:15:53 crc kubenswrapper[4794]: I1215 14:15:53.341888 4794 generic.go:334] "Generic (PLEG): container finished" podID="479822cb-595a-4182-a50c-f22a1cff3f17" containerID="617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b" exitCode=0 Dec 15 14:15:53 crc kubenswrapper[4794]: I1215 14:15:53.341924 4794 generic.go:334] "Generic (PLEG): container finished" podID="479822cb-595a-4182-a50c-f22a1cff3f17" containerID="a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269" exitCode=2 Dec 15 14:15:53 crc kubenswrapper[4794]: I1215 14:15:53.341947 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerDied","Data":"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b"} Dec 15 14:15:53 crc kubenswrapper[4794]: I1215 14:15:53.341977 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerDied","Data":"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269"} Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.098095 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.134939 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576fj\" (UniqueName: \"kubernetes.io/projected/de5ca28f-5594-46bd-9bf6-373770d3b9bb-kube-api-access-576fj\") pod \"de5ca28f-5594-46bd-9bf6-373770d3b9bb\" (UID: \"de5ca28f-5594-46bd-9bf6-373770d3b9bb\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.151909 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5ca28f-5594-46bd-9bf6-373770d3b9bb-kube-api-access-576fj" (OuterVolumeSpecName: "kube-api-access-576fj") pod "de5ca28f-5594-46bd-9bf6-373770d3b9bb" (UID: "de5ca28f-5594-46bd-9bf6-373770d3b9bb"). InnerVolumeSpecName "kube-api-access-576fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.236478 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-576fj\" (UniqueName: \"kubernetes.io/projected/de5ca28f-5594-46bd-9bf6-373770d3b9bb-kube-api-access-576fj\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.266612 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337608 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-scripts\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337660 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-config-data\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337689 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-log-httpd\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337743 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-run-httpd\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337767 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fql\" (UniqueName: \"kubernetes.io/projected/479822cb-595a-4182-a50c-f22a1cff3f17-kube-api-access-s5fql\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337819 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-combined-ca-bundle\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.337835 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-sg-core-conf-yaml\") pod \"479822cb-595a-4182-a50c-f22a1cff3f17\" (UID: \"479822cb-595a-4182-a50c-f22a1cff3f17\") " Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.338359 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.341904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.343371 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-scripts" (OuterVolumeSpecName: "scripts") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.346933 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479822cb-595a-4182-a50c-f22a1cff3f17-kube-api-access-s5fql" (OuterVolumeSpecName: "kube-api-access-s5fql") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "kube-api-access-s5fql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.354028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"de5ca28f-5594-46bd-9bf6-373770d3b9bb","Type":"ContainerDied","Data":"fc4761e89e6afaad483d0432f2a46b70e88fd1d9305ea2750f3691af6f0a260d"} Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.354072 4794 scope.go:117] "RemoveContainer" containerID="fefa8a802595cff965a530fcb0f20698d23e013791df0a1674fec5880286a3f6" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.354165 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.357811 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"5b0af393-3fce-4ed9-921e-9b56d16f4b02","Type":"ContainerStarted","Data":"4ca5da1e259aab67e58a7116834e7b781c35d3254edaf58fc9b2668c77ed4e1f"} Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.373544 4794 generic.go:334] "Generic (PLEG): container finished" podID="479822cb-595a-4182-a50c-f22a1cff3f17" containerID="7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b" exitCode=0 Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.373677 4794 generic.go:334] "Generic (PLEG): container finished" podID="479822cb-595a-4182-a50c-f22a1cff3f17" containerID="9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f" exitCode=0 Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.373729 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerDied","Data":"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b"} Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.373794 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerDied","Data":"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f"} Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.373813 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"479822cb-595a-4182-a50c-f22a1cff3f17","Type":"ContainerDied","Data":"84a76fba2b33ec8e285c6fad280bb71b04ffc3ccf209d32b4c68336087e84533"} Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.373946 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.383783 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.384050 4794 scope.go:117] "RemoveContainer" containerID="617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.388960 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstackclient" podStartSLOduration=2.143921379 podStartE2EDuration="12.388944024s" podCreationTimestamp="2025-12-15 14:15:42 +0000 UTC" firstStartedPulling="2025-12-15 14:15:43.490018046 +0000 UTC m=+1305.342040494" lastFinishedPulling="2025-12-15 14:15:53.735040691 +0000 UTC m=+1315.587063139" observedRunningTime="2025-12-15 14:15:54.387779402 +0000 UTC m=+1316.239801840" watchObservedRunningTime="2025-12-15 14:15:54.388944024 +0000 UTC m=+1316.240966462" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.405713 4794 scope.go:117] "RemoveContainer" containerID="a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.414260 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.429332 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.437765 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.439566 4794 scope.go:117] "RemoveContainer" containerID="7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440152 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440180 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440194 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/479822cb-595a-4182-a50c-f22a1cff3f17-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440204 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fql\" (UniqueName: \"kubernetes.io/projected/479822cb-595a-4182-a50c-f22a1cff3f17-kube-api-access-s5fql\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440213 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440222 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.440668 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.441071 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-notification-agent" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441196 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-notification-agent" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.441212 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" containerName="kube-state-metrics" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441219 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" containerName="kube-state-metrics" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.441233 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="sg-core" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441240 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="sg-core" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.441264 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="proxy-httpd" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441270 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="proxy-httpd" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.441282 4794 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-central-agent" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441287 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-central-agent" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441444 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-notification-agent" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441465 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="proxy-httpd" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441475 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" containerName="kube-state-metrics" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441484 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="sg-core" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.441493 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" containerName="ceilometer-central-agent" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.442483 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.458066 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"kube-state-metrics-tls-config" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.465793 4794 scope.go:117] "RemoveContainer" containerID="9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.466157 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-kube-state-metrics-svc" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.472929 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.480319 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-config-data" (OuterVolumeSpecName: "config-data") pod "479822cb-595a-4182-a50c-f22a1cff3f17" (UID: "479822cb-595a-4182-a50c-f22a1cff3f17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.503442 4794 scope.go:117] "RemoveContainer" containerID="617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.503789 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b\": container with ID starting with 617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b not found: ID does not exist" containerID="617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.503820 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b"} err="failed to get container status \"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b\": rpc error: code = NotFound desc = could not find container \"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b\": container with ID starting with 617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.503839 4794 scope.go:117] "RemoveContainer" containerID="a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.504144 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269\": container with ID starting with a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269 not found: ID does not exist" containerID="a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.504187 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269"} err="failed to get container status \"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269\": rpc error: code = NotFound desc = could not find container \"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269\": container with ID starting with a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269 not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.504220 4794 scope.go:117] "RemoveContainer" containerID="7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 14:15:54.504790 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b\": container with ID starting with 7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b not found: ID does not exist" containerID="7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.504827 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b"} err="failed to get container status \"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b\": rpc error: code = NotFound desc = could not find container \"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b\": container with ID starting with 7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.504844 4794 scope.go:117] "RemoveContainer" containerID="9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f" Dec 15 14:15:54 crc kubenswrapper[4794]: E1215 
14:15:54.508911 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f\": container with ID starting with 9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f not found: ID does not exist" containerID="9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.508939 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f"} err="failed to get container status \"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f\": rpc error: code = NotFound desc = could not find container \"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f\": container with ID starting with 9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.508955 4794 scope.go:117] "RemoveContainer" containerID="617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.509299 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b"} err="failed to get container status \"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b\": rpc error: code = NotFound desc = could not find container \"617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b\": container with ID starting with 617e8594caee3032e917da274a3801ac54be89a56b2658e55b40380ba190aa6b not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.509334 4794 scope.go:117] "RemoveContainer" containerID="a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269" Dec 15 14:15:54 crc 
kubenswrapper[4794]: I1215 14:15:54.509669 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269"} err="failed to get container status \"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269\": rpc error: code = NotFound desc = could not find container \"a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269\": container with ID starting with a17ce349b2d9e8258cba044b3fb97f4df2ef890c4a76da2b3ee2eac702877269 not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.509694 4794 scope.go:117] "RemoveContainer" containerID="7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.513747 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b"} err="failed to get container status \"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b\": rpc error: code = NotFound desc = could not find container \"7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b\": container with ID starting with 7794387aadd9a919e9f54e33b154953fcc9741ad5fbf5402f5db467787ece59b not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.513765 4794 scope.go:117] "RemoveContainer" containerID="9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.514112 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f"} err="failed to get container status \"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f\": rpc error: code = NotFound desc = could not find container \"9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f\": container 
with ID starting with 9a5afe6516acf4cc1c7d6eaeba179c3c42f6bb7f496c1a72b0d49c44ebcc5f9f not found: ID does not exist" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.533637 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.533687 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.541881 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.541927 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.541965 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9dcb\" (UniqueName: \"kubernetes.io/projected/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-api-access-q9dcb\") pod 
\"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.542025 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.542122 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479822cb-595a-4182-a50c-f22a1cff3f17-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.643690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.643810 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.643833 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " 
pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.643873 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9dcb\" (UniqueName: \"kubernetes.io/projected/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-api-access-q9dcb\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.653819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.654127 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.654291 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.665686 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9dcb\" (UniqueName: \"kubernetes.io/projected/5aa63f8b-acce-4400-8f15-8aafe966738f-kube-api-access-q9dcb\") pod \"kube-state-metrics-0\" (UID: \"5aa63f8b-acce-4400-8f15-8aafe966738f\") " 
pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.751858 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5ca28f-5594-46bd-9bf6-373770d3b9bb" path="/var/lib/kubelet/pods/de5ca28f-5594-46bd-9bf6-373770d3b9bb/volumes" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.752391 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.752423 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.767144 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.772857 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.774671 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.776304 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.778447 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.778667 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.790084 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.845758 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-config-data\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.845845 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.845869 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-run-httpd\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.845895 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-log-httpd\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.845934 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.845988 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-scripts\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.846023 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wssnj\" (UniqueName: \"kubernetes.io/projected/29590762-7864-464f-984f-4626c6566a08-kube-api-access-wssnj\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.846066 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948278 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-scripts\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948352 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wssnj\" (UniqueName: \"kubernetes.io/projected/29590762-7864-464f-984f-4626c6566a08-kube-api-access-wssnj\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948402 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-config-data\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948494 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948513 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-run-httpd\") pod \"ceilometer-0\" (UID: 
\"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948536 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-log-httpd\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.948571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.950506 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-run-httpd\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.950935 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-log-httpd\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.954629 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.956862 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-scripts\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.958667 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.958675 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-config-data\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.964967 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:54 crc kubenswrapper[4794]: I1215 14:15:54.971054 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wssnj\" (UniqueName: \"kubernetes.io/projected/29590762-7864-464f-984f-4626c6566a08-kube-api-access-wssnj\") pod \"ceilometer-0\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:55 crc kubenswrapper[4794]: I1215 14:15:55.077324 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Dec 15 14:15:55 crc kubenswrapper[4794]: I1215 14:15:55.158878 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:15:55 crc kubenswrapper[4794]: I1215 14:15:55.387441 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"5aa63f8b-acce-4400-8f15-8aafe966738f","Type":"ContainerStarted","Data":"fe2c35491472134525f1d86b9b460074f99d2bc77db59d486211ecca69e709d8"} Dec 15 14:15:55 crc kubenswrapper[4794]: I1215 14:15:55.603829 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:15:55 crc kubenswrapper[4794]: W1215 14:15:55.605020 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29590762_7864_464f_984f_4626c6566a08.slice/crio-5d0e4c2df76ceade9cf852f1f009d74c9421823ee41791af51d93180a23c951d WatchSource:0}: Error finding container 5d0e4c2df76ceade9cf852f1f009d74c9421823ee41791af51d93180a23c951d: Status 404 returned error can't find the container with id 5d0e4c2df76ceade9cf852f1f009d74c9421823ee41791af51d93180a23c951d Dec 15 14:15:56 crc kubenswrapper[4794]: I1215 14:15:56.404357 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"5aa63f8b-acce-4400-8f15-8aafe966738f","Type":"ContainerStarted","Data":"39d2bba25c6de67dea14829a575dfcb16883a37412180b5623f33325045a6279"} Dec 15 14:15:56 crc kubenswrapper[4794]: I1215 14:15:56.404713 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:15:56 crc kubenswrapper[4794]: I1215 14:15:56.406031 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerStarted","Data":"5d0e4c2df76ceade9cf852f1f009d74c9421823ee41791af51d93180a23c951d"} Dec 15 14:15:56 crc kubenswrapper[4794]: I1215 14:15:56.431887 4794 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.070851113 podStartE2EDuration="2.431867187s" podCreationTimestamp="2025-12-15 14:15:54 +0000 UTC" firstStartedPulling="2025-12-15 14:15:55.080311645 +0000 UTC m=+1316.932334083" lastFinishedPulling="2025-12-15 14:15:55.441327719 +0000 UTC m=+1317.293350157" observedRunningTime="2025-12-15 14:15:56.423481481 +0000 UTC m=+1318.275503979" watchObservedRunningTime="2025-12-15 14:15:56.431867187 +0000 UTC m=+1318.283889635" Dec 15 14:15:56 crc kubenswrapper[4794]: I1215 14:15:56.747616 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479822cb-595a-4182-a50c-f22a1cff3f17" path="/var/lib/kubelet/pods/479822cb-595a-4182-a50c-f22a1cff3f17/volumes" Dec 15 14:15:57 crc kubenswrapper[4794]: I1215 14:15:57.415228 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerStarted","Data":"6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89"} Dec 15 14:15:57 crc kubenswrapper[4794]: I1215 14:15:57.415534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerStarted","Data":"72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd"} Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.215694 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-d2tl4"] Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.217174 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.226079 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-d2tl4"] Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.402903 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brb5n\" (UniqueName: \"kubernetes.io/projected/d9717ec4-73dc-4eea-8d0b-69e55280c21f-kube-api-access-brb5n\") pod \"watcher-db-create-d2tl4\" (UID: \"d9717ec4-73dc-4eea-8d0b-69e55280c21f\") " pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.424446 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerStarted","Data":"6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5"} Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.504906 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brb5n\" (UniqueName: \"kubernetes.io/projected/d9717ec4-73dc-4eea-8d0b-69e55280c21f-kube-api-access-brb5n\") pod \"watcher-db-create-d2tl4\" (UID: \"d9717ec4-73dc-4eea-8d0b-69e55280c21f\") " pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.529370 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brb5n\" (UniqueName: \"kubernetes.io/projected/d9717ec4-73dc-4eea-8d0b-69e55280c21f-kube-api-access-brb5n\") pod \"watcher-db-create-d2tl4\" (UID: \"d9717ec4-73dc-4eea-8d0b-69e55280c21f\") " pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.538601 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:15:58 crc kubenswrapper[4794]: I1215 14:15:58.987378 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-d2tl4"] Dec 15 14:15:58 crc kubenswrapper[4794]: W1215 14:15:58.992053 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9717ec4_73dc_4eea_8d0b_69e55280c21f.slice/crio-be8b1e5939cc2a8e13503874d9fa56fea75bed70385db16ad2a7a03088fa6018 WatchSource:0}: Error finding container be8b1e5939cc2a8e13503874d9fa56fea75bed70385db16ad2a7a03088fa6018: Status 404 returned error can't find the container with id be8b1e5939cc2a8e13503874d9fa56fea75bed70385db16ad2a7a03088fa6018 Dec 15 14:15:59 crc kubenswrapper[4794]: I1215 14:15:59.438886 4794 generic.go:334] "Generic (PLEG): container finished" podID="d9717ec4-73dc-4eea-8d0b-69e55280c21f" containerID="8d62df96eb5f27da9a0c57c2145a0f3b9018b8526678106afcd4ff5a703fe513" exitCode=0 Dec 15 14:15:59 crc kubenswrapper[4794]: I1215 14:15:59.438939 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-d2tl4" event={"ID":"d9717ec4-73dc-4eea-8d0b-69e55280c21f","Type":"ContainerDied","Data":"8d62df96eb5f27da9a0c57c2145a0f3b9018b8526678106afcd4ff5a703fe513"} Dec 15 14:15:59 crc kubenswrapper[4794]: I1215 14:15:59.439287 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-d2tl4" event={"ID":"d9717ec4-73dc-4eea-8d0b-69e55280c21f","Type":"ContainerStarted","Data":"be8b1e5939cc2a8e13503874d9fa56fea75bed70385db16ad2a7a03088fa6018"} Dec 15 14:16:00 crc kubenswrapper[4794]: I1215 14:16:00.450387 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerStarted","Data":"68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b"} 
Dec 15 14:16:00 crc kubenswrapper[4794]: I1215 14:16:00.486683 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.941712993 podStartE2EDuration="6.486658067s" podCreationTimestamp="2025-12-15 14:15:54 +0000 UTC" firstStartedPulling="2025-12-15 14:15:55.607668755 +0000 UTC m=+1317.459691193" lastFinishedPulling="2025-12-15 14:15:59.152613829 +0000 UTC m=+1321.004636267" observedRunningTime="2025-12-15 14:16:00.482888551 +0000 UTC m=+1322.334910999" watchObservedRunningTime="2025-12-15 14:16:00.486658067 +0000 UTC m=+1322.338680505" Dec 15 14:16:00 crc kubenswrapper[4794]: I1215 14:16:00.883940 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.057866 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brb5n\" (UniqueName: \"kubernetes.io/projected/d9717ec4-73dc-4eea-8d0b-69e55280c21f-kube-api-access-brb5n\") pod \"d9717ec4-73dc-4eea-8d0b-69e55280c21f\" (UID: \"d9717ec4-73dc-4eea-8d0b-69e55280c21f\") " Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.068378 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9717ec4-73dc-4eea-8d0b-69e55280c21f-kube-api-access-brb5n" (OuterVolumeSpecName: "kube-api-access-brb5n") pod "d9717ec4-73dc-4eea-8d0b-69e55280c21f" (UID: "d9717ec4-73dc-4eea-8d0b-69e55280c21f"). InnerVolumeSpecName "kube-api-access-brb5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.159938 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brb5n\" (UniqueName: \"kubernetes.io/projected/d9717ec4-73dc-4eea-8d0b-69e55280c21f-kube-api-access-brb5n\") on node \"crc\" DevicePath \"\"" Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.460504 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-d2tl4" event={"ID":"d9717ec4-73dc-4eea-8d0b-69e55280c21f","Type":"ContainerDied","Data":"be8b1e5939cc2a8e13503874d9fa56fea75bed70385db16ad2a7a03088fa6018"} Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.460549 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8b1e5939cc2a8e13503874d9fa56fea75bed70385db16ad2a7a03088fa6018" Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.460523 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-d2tl4" Dec 15 14:16:01 crc kubenswrapper[4794]: I1215 14:16:01.460761 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:16:04 crc kubenswrapper[4794]: I1215 14:16:04.774027 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.311181 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-258e-account-create-tk96s"] Dec 15 14:16:08 crc kubenswrapper[4794]: E1215 14:16:08.311904 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9717ec4-73dc-4eea-8d0b-69e55280c21f" containerName="mariadb-database-create" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.311920 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9717ec4-73dc-4eea-8d0b-69e55280c21f" 
containerName="mariadb-database-create" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.312172 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9717ec4-73dc-4eea-8d0b-69e55280c21f" containerName="mariadb-database-create" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.312826 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.314895 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.325715 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-258e-account-create-tk96s"] Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.471379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wdc\" (UniqueName: \"kubernetes.io/projected/3412d7d8-45cd-4978-93be-103f1605bc49-kube-api-access-p2wdc\") pod \"watcher-258e-account-create-tk96s\" (UID: \"3412d7d8-45cd-4978-93be-103f1605bc49\") " pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.572366 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wdc\" (UniqueName: \"kubernetes.io/projected/3412d7d8-45cd-4978-93be-103f1605bc49-kube-api-access-p2wdc\") pod \"watcher-258e-account-create-tk96s\" (UID: \"3412d7d8-45cd-4978-93be-103f1605bc49\") " pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.599668 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wdc\" (UniqueName: \"kubernetes.io/projected/3412d7d8-45cd-4978-93be-103f1605bc49-kube-api-access-p2wdc\") pod \"watcher-258e-account-create-tk96s\" (UID: 
\"3412d7d8-45cd-4978-93be-103f1605bc49\") " pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:08 crc kubenswrapper[4794]: I1215 14:16:08.634902 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:09 crc kubenswrapper[4794]: I1215 14:16:09.154197 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-258e-account-create-tk96s"] Dec 15 14:16:09 crc kubenswrapper[4794]: I1215 14:16:09.529105 4794 generic.go:334] "Generic (PLEG): container finished" podID="3412d7d8-45cd-4978-93be-103f1605bc49" containerID="6b6665900b5f3994153d359afe69be57e774fe4797b3d08234da9b2d3e47ad19" exitCode=0 Dec 15 14:16:09 crc kubenswrapper[4794]: I1215 14:16:09.529441 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" event={"ID":"3412d7d8-45cd-4978-93be-103f1605bc49","Type":"ContainerDied","Data":"6b6665900b5f3994153d359afe69be57e774fe4797b3d08234da9b2d3e47ad19"} Dec 15 14:16:09 crc kubenswrapper[4794]: I1215 14:16:09.529472 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" event={"ID":"3412d7d8-45cd-4978-93be-103f1605bc49","Type":"ContainerStarted","Data":"4a0e1d06e6ac828b4a7e52019537fbc5fd6520187746687830bf4140fa4fa0f6"} Dec 15 14:16:10 crc kubenswrapper[4794]: I1215 14:16:10.855480 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:11 crc kubenswrapper[4794]: I1215 14:16:11.040914 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wdc\" (UniqueName: \"kubernetes.io/projected/3412d7d8-45cd-4978-93be-103f1605bc49-kube-api-access-p2wdc\") pod \"3412d7d8-45cd-4978-93be-103f1605bc49\" (UID: \"3412d7d8-45cd-4978-93be-103f1605bc49\") " Dec 15 14:16:11 crc kubenswrapper[4794]: I1215 14:16:11.047301 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3412d7d8-45cd-4978-93be-103f1605bc49-kube-api-access-p2wdc" (OuterVolumeSpecName: "kube-api-access-p2wdc") pod "3412d7d8-45cd-4978-93be-103f1605bc49" (UID: "3412d7d8-45cd-4978-93be-103f1605bc49"). InnerVolumeSpecName "kube-api-access-p2wdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:16:11 crc kubenswrapper[4794]: I1215 14:16:11.143816 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wdc\" (UniqueName: \"kubernetes.io/projected/3412d7d8-45cd-4978-93be-103f1605bc49-kube-api-access-p2wdc\") on node \"crc\" DevicePath \"\"" Dec 15 14:16:11 crc kubenswrapper[4794]: I1215 14:16:11.546652 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" event={"ID":"3412d7d8-45cd-4978-93be-103f1605bc49","Type":"ContainerDied","Data":"4a0e1d06e6ac828b4a7e52019537fbc5fd6520187746687830bf4140fa4fa0f6"} Dec 15 14:16:11 crc kubenswrapper[4794]: I1215 14:16:11.546707 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a0e1d06e6ac828b4a7e52019537fbc5fd6520187746687830bf4140fa4fa0f6" Dec 15 14:16:11 crc kubenswrapper[4794]: I1215 14:16:11.546763 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-258e-account-create-tk96s" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.589816 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8"] Dec 15 14:16:13 crc kubenswrapper[4794]: E1215 14:16:13.590412 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3412d7d8-45cd-4978-93be-103f1605bc49" containerName="mariadb-account-create" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.590423 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3412d7d8-45cd-4978-93be-103f1605bc49" containerName="mariadb-account-create" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.590568 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3412d7d8-45cd-4978-93be-103f1605bc49" containerName="mariadb-account-create" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.591120 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.594014 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.594154 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sl9k7" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.602064 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8"] Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.784363 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.784485 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-config-data\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.784523 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-db-sync-config-data\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.784543 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68flv\" (UniqueName: \"kubernetes.io/projected/3defa926-87a9-4d6b-9af8-c1d9c591ca38-kube-api-access-68flv\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.885968 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-config-data\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.886026 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-db-sync-config-data\") pod 
\"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.886048 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68flv\" (UniqueName: \"kubernetes.io/projected/3defa926-87a9-4d6b-9af8-c1d9c591ca38-kube-api-access-68flv\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.886071 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.891681 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-config-data\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.891819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.898180 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-db-sync-config-data\") pod 
\"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.906873 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68flv\" (UniqueName: \"kubernetes.io/projected/3defa926-87a9-4d6b-9af8-c1d9c591ca38-kube-api-access-68flv\") pod \"watcher-kuttl-db-sync-rwqd8\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:13 crc kubenswrapper[4794]: I1215 14:16:13.915329 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:14 crc kubenswrapper[4794]: I1215 14:16:14.258057 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8"] Dec 15 14:16:14 crc kubenswrapper[4794]: W1215 14:16:14.262872 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3defa926_87a9_4d6b_9af8_c1d9c591ca38.slice/crio-0a60b738e27de37aff465c147985257cb38424c6ce55471c0d8e98625f961b9e WatchSource:0}: Error finding container 0a60b738e27de37aff465c147985257cb38424c6ce55471c0d8e98625f961b9e: Status 404 returned error can't find the container with id 0a60b738e27de37aff465c147985257cb38424c6ce55471c0d8e98625f961b9e Dec 15 14:16:14 crc kubenswrapper[4794]: I1215 14:16:14.582895 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" event={"ID":"3defa926-87a9-4d6b-9af8-c1d9c591ca38","Type":"ContainerStarted","Data":"0a60b738e27de37aff465c147985257cb38424c6ce55471c0d8e98625f961b9e"} Dec 15 14:16:24 crc kubenswrapper[4794]: I1215 14:16:24.534253 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:16:24 crc kubenswrapper[4794]: I1215 14:16:24.535345 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:16:24 crc kubenswrapper[4794]: I1215 14:16:24.535427 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:16:24 crc kubenswrapper[4794]: I1215 14:16:24.536691 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29f0e3d8aa7777138c2e7242f8fdf176d9b72c162c2a35610e7f08f0da28f0c2"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:16:24 crc kubenswrapper[4794]: I1215 14:16:24.536765 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://29f0e3d8aa7777138c2e7242f8fdf176d9b72c162c2a35610e7f08f0da28f0c2" gracePeriod=600 Dec 15 14:16:25 crc kubenswrapper[4794]: I1215 14:16:25.169677 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:16:25 crc kubenswrapper[4794]: I1215 14:16:25.681340 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="29f0e3d8aa7777138c2e7242f8fdf176d9b72c162c2a35610e7f08f0da28f0c2" exitCode=0 
Dec 15 14:16:25 crc kubenswrapper[4794]: I1215 14:16:25.681377 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"29f0e3d8aa7777138c2e7242f8fdf176d9b72c162c2a35610e7f08f0da28f0c2"} Dec 15 14:16:25 crc kubenswrapper[4794]: I1215 14:16:25.681457 4794 scope.go:117] "RemoveContainer" containerID="69cd9719316ef2e80637a18253154a420b07d0d2cd576a13b5c600dba7da07e9" Dec 15 14:16:32 crc kubenswrapper[4794]: E1215 14:16:32.893470 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.145:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Dec 15 14:16:32 crc kubenswrapper[4794]: E1215 14:16:32.894143 4794 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.145:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Dec 15 14:16:32 crc kubenswrapper[4794]: E1215 14:16:32.894285 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-kuttl-db-sync,Image:38.102.83.145:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68flv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-kuttl-db-sync-rwqd8_watcher-kuttl-default(3defa926-87a9-4d6b-9af8-c1d9c591ca38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 15 14:16:32 crc kubenswrapper[4794]: E1215 14:16:32.895708 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" podUID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" Dec 15 14:16:33 crc kubenswrapper[4794]: I1215 14:16:33.745940 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098"} Dec 15 14:16:33 crc kubenswrapper[4794]: E1215 14:16:33.747312 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.145:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" podUID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" Dec 15 14:16:45 crc kubenswrapper[4794]: I1215 14:16:45.740820 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 14:16:46 crc kubenswrapper[4794]: I1215 14:16:46.872473 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" event={"ID":"3defa926-87a9-4d6b-9af8-c1d9c591ca38","Type":"ContainerStarted","Data":"16bce1b3c7c7ae605b4c5da5a921c6e06ee09cba862c3c611a5d02b91c6696d7"} Dec 15 14:16:49 crc kubenswrapper[4794]: I1215 14:16:49.908254 4794 generic.go:334] "Generic (PLEG): container finished" podID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" 
containerID="16bce1b3c7c7ae605b4c5da5a921c6e06ee09cba862c3c611a5d02b91c6696d7" exitCode=0 Dec 15 14:16:49 crc kubenswrapper[4794]: I1215 14:16:49.908379 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" event={"ID":"3defa926-87a9-4d6b-9af8-c1d9c591ca38","Type":"ContainerDied","Data":"16bce1b3c7c7ae605b4c5da5a921c6e06ee09cba862c3c611a5d02b91c6696d7"} Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.243494 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.275473 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-config-data\") pod \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.275559 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-db-sync-config-data\") pod \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.275617 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-combined-ca-bundle\") pod \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\" (UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.275649 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68flv\" (UniqueName: \"kubernetes.io/projected/3defa926-87a9-4d6b-9af8-c1d9c591ca38-kube-api-access-68flv\") pod \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\" 
(UID: \"3defa926-87a9-4d6b-9af8-c1d9c591ca38\") " Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.283104 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3defa926-87a9-4d6b-9af8-c1d9c591ca38" (UID: "3defa926-87a9-4d6b-9af8-c1d9c591ca38"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.283279 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3defa926-87a9-4d6b-9af8-c1d9c591ca38-kube-api-access-68flv" (OuterVolumeSpecName: "kube-api-access-68flv") pod "3defa926-87a9-4d6b-9af8-c1d9c591ca38" (UID: "3defa926-87a9-4d6b-9af8-c1d9c591ca38"). InnerVolumeSpecName "kube-api-access-68flv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.314179 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3defa926-87a9-4d6b-9af8-c1d9c591ca38" (UID: "3defa926-87a9-4d6b-9af8-c1d9c591ca38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.345086 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-config-data" (OuterVolumeSpecName: "config-data") pod "3defa926-87a9-4d6b-9af8-c1d9c591ca38" (UID: "3defa926-87a9-4d6b-9af8-c1d9c591ca38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.378811 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.378858 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.378878 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defa926-87a9-4d6b-9af8-c1d9c591ca38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.378897 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68flv\" (UniqueName: \"kubernetes.io/projected/3defa926-87a9-4d6b-9af8-c1d9c591ca38-kube-api-access-68flv\") on node \"crc\" DevicePath \"\"" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.927351 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" event={"ID":"3defa926-87a9-4d6b-9af8-c1d9c591ca38","Type":"ContainerDied","Data":"0a60b738e27de37aff465c147985257cb38424c6ce55471c0d8e98625f961b9e"} Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.927619 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a60b738e27de37aff465c147985257cb38424c6ce55471c0d8e98625f961b9e" Dec 15 14:16:51 crc kubenswrapper[4794]: I1215 14:16:51.927404 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.259715 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:16:52 crc kubenswrapper[4794]: E1215 14:16:52.260048 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" containerName="watcher-kuttl-db-sync" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.260060 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" containerName="watcher-kuttl-db-sync" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.260259 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" containerName="watcher-kuttl-db-sync" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.261068 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.265034 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-sl9k7" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.268066 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.286017 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.287268 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.290694 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.293010 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.293057 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t72k\" (UniqueName: \"kubernetes.io/projected/a12ed886-0e2b-4acb-aa40-79fe186ff023-kube-api-access-4t72k\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.293170 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12ed886-0e2b-4acb-aa40-79fe186ff023-logs\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.293199 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.293232 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.293781 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.303962 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.374478 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.375503 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.377232 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.389866 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394383 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394439 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgtfx\" (UniqueName: 
\"kubernetes.io/projected/1c204c98-7ab9-483e-bb80-f763162cfa19-kube-api-access-hgtfx\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394461 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c204c98-7ab9-483e-bb80-f763162cfa19-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394487 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87dp\" (UniqueName: \"kubernetes.io/projected/d9b6786b-b051-4b68-9bb8-4fa711927c71-kube-api-access-x87dp\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394509 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394530 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t72k\" (UniqueName: \"kubernetes.io/projected/a12ed886-0e2b-4acb-aa40-79fe186ff023-kube-api-access-4t72k\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394594 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394623 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394642 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394683 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394704 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b6786b-b051-4b68-9bb8-4fa711927c71-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394786 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394872 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12ed886-0e2b-4acb-aa40-79fe186ff023-logs\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.394888 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.395358 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12ed886-0e2b-4acb-aa40-79fe186ff023-logs\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.400414 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.404076 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.415471 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t72k\" (UniqueName: \"kubernetes.io/projected/a12ed886-0e2b-4acb-aa40-79fe186ff023-kube-api-access-4t72k\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.417272 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.496754 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497164 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d9b6786b-b051-4b68-9bb8-4fa711927c71-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497352 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtfx\" (UniqueName: \"kubernetes.io/projected/1c204c98-7ab9-483e-bb80-f763162cfa19-kube-api-access-hgtfx\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497427 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c204c98-7ab9-483e-bb80-f763162cfa19-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497515 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x87dp\" (UniqueName: \"kubernetes.io/projected/d9b6786b-b051-4b68-9bb8-4fa711927c71-kube-api-access-x87dp\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497642 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497745 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497470 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b6786b-b051-4b68-9bb8-4fa711927c71-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.497798 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c204c98-7ab9-483e-bb80-f763162cfa19-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.500246 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.500378 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.500653 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.501573 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.508457 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.512321 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x87dp\" (UniqueName: \"kubernetes.io/projected/d9b6786b-b051-4b68-9bb8-4fa711927c71-kube-api-access-x87dp\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.513684 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtfx\" 
(UniqueName: \"kubernetes.io/projected/1c204c98-7ab9-483e-bb80-f763162cfa19-kube-api-access-hgtfx\") pod \"watcher-kuttl-applier-0\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.578365 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.602805 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:16:52 crc kubenswrapper[4794]: I1215 14:16:52.688687 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.104332 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.170269 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:16:53 crc kubenswrapper[4794]: W1215 14:16:53.170422 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b6786b_b051_4b68_9bb8_4fa711927c71.slice/crio-d356d9cf89b1fc1ed53b71312fdf7a46d85a47dce830546fe78f1518c30b4934 WatchSource:0}: Error finding container d356d9cf89b1fc1ed53b71312fdf7a46d85a47dce830546fe78f1518c30b4934: Status 404 returned error can't find the container with id d356d9cf89b1fc1ed53b71312fdf7a46d85a47dce830546fe78f1518c30b4934 Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.195407 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.948952 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d9b6786b-b051-4b68-9bb8-4fa711927c71","Type":"ContainerStarted","Data":"d356d9cf89b1fc1ed53b71312fdf7a46d85a47dce830546fe78f1518c30b4934"} Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.952348 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a12ed886-0e2b-4acb-aa40-79fe186ff023","Type":"ContainerStarted","Data":"768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f"} Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.952397 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a12ed886-0e2b-4acb-aa40-79fe186ff023","Type":"ContainerStarted","Data":"ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b"} Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.952410 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a12ed886-0e2b-4acb-aa40-79fe186ff023","Type":"ContainerStarted","Data":"6360e3510171cd0356808136a91592d3536041c723f33fa835ab32dbcdcdabd1"} Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.952607 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.953761 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c204c98-7ab9-483e-bb80-f763162cfa19","Type":"ContainerStarted","Data":"d41791f98e2cc4aa4fc3aaf70686e002fc0d8ae51a82c4ce7a18a8fc5ed87ce9"} Dec 15 14:16:53 crc kubenswrapper[4794]: I1215 14:16:53.977463 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.977447029 podStartE2EDuration="1.977447029s" podCreationTimestamp="2025-12-15 14:16:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:16:53.974837965 +0000 UTC m=+1375.826860433" watchObservedRunningTime="2025-12-15 14:16:53.977447029 +0000 UTC m=+1375.829469477" Dec 15 14:16:54 crc kubenswrapper[4794]: I1215 14:16:54.962695 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c204c98-7ab9-483e-bb80-f763162cfa19","Type":"ContainerStarted","Data":"d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9"} Dec 15 14:16:54 crc kubenswrapper[4794]: I1215 14:16:54.964222 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d9b6786b-b051-4b68-9bb8-4fa711927c71","Type":"ContainerStarted","Data":"55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30"} Dec 15 14:16:54 crc kubenswrapper[4794]: I1215 14:16:54.981746 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.9418353019999999 podStartE2EDuration="2.981721982s" podCreationTimestamp="2025-12-15 14:16:52 +0000 UTC" firstStartedPulling="2025-12-15 14:16:53.211395061 +0000 UTC m=+1375.063417499" lastFinishedPulling="2025-12-15 14:16:54.251281751 +0000 UTC m=+1376.103304179" observedRunningTime="2025-12-15 14:16:54.980896089 +0000 UTC m=+1376.832918547" watchObservedRunningTime="2025-12-15 14:16:54.981721982 +0000 UTC m=+1376.833744460" Dec 15 14:16:54 crc kubenswrapper[4794]: I1215 14:16:54.999782 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.918286017 podStartE2EDuration="2.999760742s" podCreationTimestamp="2025-12-15 14:16:52 +0000 UTC" firstStartedPulling="2025-12-15 14:16:53.175126596 +0000 UTC m=+1375.027149074" lastFinishedPulling="2025-12-15 
14:16:54.256601361 +0000 UTC m=+1376.108623799" observedRunningTime="2025-12-15 14:16:54.997857478 +0000 UTC m=+1376.849879946" watchObservedRunningTime="2025-12-15 14:16:54.999760742 +0000 UTC m=+1376.851783190" Dec 15 14:16:56 crc kubenswrapper[4794]: I1215 14:16:56.177659 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:57 crc kubenswrapper[4794]: I1215 14:16:57.579333 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:16:57 crc kubenswrapper[4794]: I1215 14:16:57.689021 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:02 crc kubenswrapper[4794]: I1215 14:17:02.579174 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:02 crc kubenswrapper[4794]: I1215 14:17:02.584512 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:02 crc kubenswrapper[4794]: I1215 14:17:02.604232 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:02 crc kubenswrapper[4794]: I1215 14:17:02.634544 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:02 crc kubenswrapper[4794]: I1215 14:17:02.689398 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:02 crc kubenswrapper[4794]: I1215 14:17:02.711167 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:03 crc kubenswrapper[4794]: I1215 14:17:03.039230 4794 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:03 crc kubenswrapper[4794]: I1215 14:17:03.048154 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:03 crc kubenswrapper[4794]: I1215 14:17:03.068475 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:03 crc kubenswrapper[4794]: I1215 14:17:03.091275 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:05 crc kubenswrapper[4794]: I1215 14:17:05.067011 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:05 crc kubenswrapper[4794]: I1215 14:17:05.067532 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-central-agent" containerID="cri-o://72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd" gracePeriod=30 Dec 15 14:17:05 crc kubenswrapper[4794]: I1215 14:17:05.067752 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="proxy-httpd" containerID="cri-o://68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b" gracePeriod=30 Dec 15 14:17:05 crc kubenswrapper[4794]: I1215 14:17:05.067803 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="sg-core" containerID="cri-o://6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5" gracePeriod=30 Dec 15 14:17:05 crc kubenswrapper[4794]: I1215 14:17:05.067836 4794 
kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-notification-agent" containerID="cri-o://6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89" gracePeriod=30 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.066200 4794 generic.go:334] "Generic (PLEG): container finished" podID="29590762-7864-464f-984f-4626c6566a08" containerID="68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b" exitCode=0 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.066375 4794 generic.go:334] "Generic (PLEG): container finished" podID="29590762-7864-464f-984f-4626c6566a08" containerID="6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5" exitCode=2 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.066383 4794 generic.go:334] "Generic (PLEG): container finished" podID="29590762-7864-464f-984f-4626c6566a08" containerID="72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd" exitCode=0 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.066402 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerDied","Data":"68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b"} Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.066426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerDied","Data":"6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5"} Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.066436 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerDied","Data":"72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd"} 
Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.203737 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.211420 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-rwqd8"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.237305 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher258e-account-delete-jzw64"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.238358 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.250722 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher258e-account-delete-jzw64"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.258797 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.259021 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="d9b6786b-b051-4b68-9bb8-4fa711927c71" containerName="watcher-decision-engine" containerID="cri-o://55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30" gracePeriod=30 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.312852 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.313046 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-kuttl-api-log" containerID="cri-o://ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b" 
gracePeriod=30 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.313387 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-api" containerID="cri-o://768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f" gracePeriod=30 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.350477 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj94v\" (UniqueName: \"kubernetes.io/projected/9c8042d6-36fa-4b7e-b7dc-280bd5621712-kube-api-access-hj94v\") pod \"watcher258e-account-delete-jzw64\" (UID: \"9c8042d6-36fa-4b7e-b7dc-280bd5621712\") " pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.367983 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.368192 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1c204c98-7ab9-483e-bb80-f763162cfa19" containerName="watcher-applier" containerID="cri-o://d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" gracePeriod=30 Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.451858 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj94v\" (UniqueName: \"kubernetes.io/projected/9c8042d6-36fa-4b7e-b7dc-280bd5621712-kube-api-access-hj94v\") pod \"watcher258e-account-delete-jzw64\" (UID: \"9c8042d6-36fa-4b7e-b7dc-280bd5621712\") " pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.470311 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj94v\" (UniqueName: 
\"kubernetes.io/projected/9c8042d6-36fa-4b7e-b7dc-280bd5621712-kube-api-access-hj94v\") pod \"watcher258e-account-delete-jzw64\" (UID: \"9c8042d6-36fa-4b7e-b7dc-280bd5621712\") " pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.553519 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:06 crc kubenswrapper[4794]: I1215 14:17:06.747635 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3defa926-87a9-4d6b-9af8-c1d9c591ca38" path="/var/lib/kubelet/pods/3defa926-87a9-4d6b-9af8-c1d9c591ca38/volumes" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.006703 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher258e-account-delete-jzw64"] Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.076594 4794 generic.go:334] "Generic (PLEG): container finished" podID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerID="ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b" exitCode=143 Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.076654 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a12ed886-0e2b-4acb-aa40-79fe186ff023","Type":"ContainerDied","Data":"ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b"} Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.077795 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" event={"ID":"9c8042d6-36fa-4b7e-b7dc-280bd5621712","Type":"ContainerStarted","Data":"18b5efe54ea5e5dc2465d85c113975b538f7715589cf021db0c582a4e43ccdf2"} Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.566106 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.669826 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-custom-prometheus-ca\") pod \"a12ed886-0e2b-4acb-aa40-79fe186ff023\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.669875 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-config-data\") pod \"a12ed886-0e2b-4acb-aa40-79fe186ff023\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.669908 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12ed886-0e2b-4acb-aa40-79fe186ff023-logs\") pod \"a12ed886-0e2b-4acb-aa40-79fe186ff023\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.669951 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t72k\" (UniqueName: \"kubernetes.io/projected/a12ed886-0e2b-4acb-aa40-79fe186ff023-kube-api-access-4t72k\") pod \"a12ed886-0e2b-4acb-aa40-79fe186ff023\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.669999 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-combined-ca-bundle\") pod \"a12ed886-0e2b-4acb-aa40-79fe186ff023\" (UID: \"a12ed886-0e2b-4acb-aa40-79fe186ff023\") " Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.670755 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a12ed886-0e2b-4acb-aa40-79fe186ff023-logs" (OuterVolumeSpecName: "logs") pod "a12ed886-0e2b-4acb-aa40-79fe186ff023" (UID: "a12ed886-0e2b-4acb-aa40-79fe186ff023"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.676256 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12ed886-0e2b-4acb-aa40-79fe186ff023-kube-api-access-4t72k" (OuterVolumeSpecName: "kube-api-access-4t72k") pod "a12ed886-0e2b-4acb-aa40-79fe186ff023" (UID: "a12ed886-0e2b-4acb-aa40-79fe186ff023"). InnerVolumeSpecName "kube-api-access-4t72k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:07 crc kubenswrapper[4794]: E1215 14:17:07.691863 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:17:07 crc kubenswrapper[4794]: E1215 14:17:07.693368 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:17:07 crc kubenswrapper[4794]: E1215 14:17:07.695826 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:17:07 crc kubenswrapper[4794]: E1215 14:17:07.695867 4794 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1c204c98-7ab9-483e-bb80-f763162cfa19" containerName="watcher-applier" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.709776 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a12ed886-0e2b-4acb-aa40-79fe186ff023" (UID: "a12ed886-0e2b-4acb-aa40-79fe186ff023"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.722496 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-config-data" (OuterVolumeSpecName: "config-data") pod "a12ed886-0e2b-4acb-aa40-79fe186ff023" (UID: "a12ed886-0e2b-4acb-aa40-79fe186ff023"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.728761 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a12ed886-0e2b-4acb-aa40-79fe186ff023" (UID: "a12ed886-0e2b-4acb-aa40-79fe186ff023"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.771786 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.771814 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.771823 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12ed886-0e2b-4acb-aa40-79fe186ff023-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.771831 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12ed886-0e2b-4acb-aa40-79fe186ff023-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:07 crc kubenswrapper[4794]: I1215 14:17:07.771840 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t72k\" (UniqueName: \"kubernetes.io/projected/a12ed886-0e2b-4acb-aa40-79fe186ff023-kube-api-access-4t72k\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.086088 4794 generic.go:334] "Generic (PLEG): container finished" podID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerID="768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f" exitCode=0 Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.086155 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.086184 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a12ed886-0e2b-4acb-aa40-79fe186ff023","Type":"ContainerDied","Data":"768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f"} Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.086231 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"a12ed886-0e2b-4acb-aa40-79fe186ff023","Type":"ContainerDied","Data":"6360e3510171cd0356808136a91592d3536041c723f33fa835ab32dbcdcdabd1"} Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.086249 4794 scope.go:117] "RemoveContainer" containerID="768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.094438 4794 generic.go:334] "Generic (PLEG): container finished" podID="9c8042d6-36fa-4b7e-b7dc-280bd5621712" containerID="d95f3ca81c1d9be5aa62596822040202489e82ecb3a64fea78a2e03e3f9e4d4d" exitCode=0 Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.094507 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" event={"ID":"9c8042d6-36fa-4b7e-b7dc-280bd5621712","Type":"ContainerDied","Data":"d95f3ca81c1d9be5aa62596822040202489e82ecb3a64fea78a2e03e3f9e4d4d"} Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.119291 4794 scope.go:117] "RemoveContainer" containerID="ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.137629 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.142748 4794 scope.go:117] "RemoveContainer" containerID="768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f" Dec 15 
14:17:08 crc kubenswrapper[4794]: E1215 14:17:08.143243 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f\": container with ID starting with 768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f not found: ID does not exist" containerID="768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.143270 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f"} err="failed to get container status \"768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f\": rpc error: code = NotFound desc = could not find container \"768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f\": container with ID starting with 768571dc812495a10b4a823ba69ba7798d622948fee842e4cbfe54959e19cd6f not found: ID does not exist" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.143291 4794 scope.go:117] "RemoveContainer" containerID="ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b" Dec 15 14:17:08 crc kubenswrapper[4794]: E1215 14:17:08.144940 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b\": container with ID starting with ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b not found: ID does not exist" containerID="ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.145072 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b"} err="failed to get container status 
\"ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b\": rpc error: code = NotFound desc = could not find container \"ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b\": container with ID starting with ff7070cebc94e68625ed8757cd6677808aa0ee31fcf82a66dfc7d3746065b22b not found: ID does not exist" Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.148388 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:08 crc kubenswrapper[4794]: I1215 14:17:08.753526 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" path="/var/lib/kubelet/pods/a12ed886-0e2b-4acb-aa40-79fe186ff023/volumes" Dec 15 14:17:09 crc kubenswrapper[4794]: I1215 14:17:09.520360 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:09 crc kubenswrapper[4794]: I1215 14:17:09.597533 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj94v\" (UniqueName: \"kubernetes.io/projected/9c8042d6-36fa-4b7e-b7dc-280bd5621712-kube-api-access-hj94v\") pod \"9c8042d6-36fa-4b7e-b7dc-280bd5621712\" (UID: \"9c8042d6-36fa-4b7e-b7dc-280bd5621712\") " Dec 15 14:17:09 crc kubenswrapper[4794]: I1215 14:17:09.601889 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8042d6-36fa-4b7e-b7dc-280bd5621712-kube-api-access-hj94v" (OuterVolumeSpecName: "kube-api-access-hj94v") pod "9c8042d6-36fa-4b7e-b7dc-280bd5621712" (UID: "9c8042d6-36fa-4b7e-b7dc-280bd5621712"). InnerVolumeSpecName "kube-api-access-hj94v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:09 crc kubenswrapper[4794]: I1215 14:17:09.698890 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj94v\" (UniqueName: \"kubernetes.io/projected/9c8042d6-36fa-4b7e-b7dc-280bd5621712-kube-api-access-hj94v\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.026824 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.103179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wssnj\" (UniqueName: \"kubernetes.io/projected/29590762-7864-464f-984f-4626c6566a08-kube-api-access-wssnj\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.103569 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-log-httpd\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.103703 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-config-data\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.103747 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-ceilometer-tls-certs\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 
14:17:10.103802 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-combined-ca-bundle\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.104100 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.104281 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-scripts\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.104361 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-run-httpd\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.104428 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-sg-core-conf-yaml\") pod \"29590762-7864-464f-984f-4626c6566a08\" (UID: \"29590762-7864-464f-984f-4626c6566a08\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.104921 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.105050 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.107363 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29590762-7864-464f-984f-4626c6566a08-kube-api-access-wssnj" (OuterVolumeSpecName: "kube-api-access-wssnj") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "kube-api-access-wssnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.107566 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-scripts" (OuterVolumeSpecName: "scripts") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.117946 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" event={"ID":"9c8042d6-36fa-4b7e-b7dc-280bd5621712","Type":"ContainerDied","Data":"18b5efe54ea5e5dc2465d85c113975b538f7715589cf021db0c582a4e43ccdf2"} Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.118115 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b5efe54ea5e5dc2465d85c113975b538f7715589cf021db0c582a4e43ccdf2" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.118138 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher258e-account-delete-jzw64" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.120901 4794 generic.go:334] "Generic (PLEG): container finished" podID="29590762-7864-464f-984f-4626c6566a08" containerID="6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89" exitCode=0 Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.120933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerDied","Data":"6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89"} Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.120954 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"29590762-7864-464f-984f-4626c6566a08","Type":"ContainerDied","Data":"5d0e4c2df76ceade9cf852f1f009d74c9421823ee41791af51d93180a23c951d"} Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.120976 4794 scope.go:117] "RemoveContainer" containerID="68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.121084 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.135981 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.140027 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.150463 4794 scope.go:117] "RemoveContainer" containerID="6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.174021 4794 scope.go:117] "RemoveContainer" containerID="6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.188348 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.192602 4794 scope.go:117] "RemoveContainer" containerID="72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.193795 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c8042d6_36fa_4b7e_b7dc_280bd5621712.slice/crio-18b5efe54ea5e5dc2465d85c113975b538f7715589cf021db0c582a4e43ccdf2\": RecentStats: unable to find data in memory cache]" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.206230 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wssnj\" (UniqueName: \"kubernetes.io/projected/29590762-7864-464f-984f-4626c6566a08-kube-api-access-wssnj\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.206339 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.206411 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.206467 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.206520 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29590762-7864-464f-984f-4626c6566a08-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 
crc kubenswrapper[4794]: I1215 14:17:10.206572 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.214536 4794 scope.go:117] "RemoveContainer" containerID="68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.215021 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b\": container with ID starting with 68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b not found: ID does not exist" containerID="68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.215052 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b"} err="failed to get container status \"68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b\": rpc error: code = NotFound desc = could not find container \"68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b\": container with ID starting with 68520e1cce586b78ac395738e815f0a6f9a9844cbd50fb0dc25745cc0247087b not found: ID does not exist" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.215073 4794 scope.go:117] "RemoveContainer" containerID="6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.215354 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5\": container with ID starting with 
6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5 not found: ID does not exist" containerID="6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.215386 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5"} err="failed to get container status \"6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5\": rpc error: code = NotFound desc = could not find container \"6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5\": container with ID starting with 6b2c3f48d0f2001143b9d25d005999ec40bd67cabdaef3311bd4fa50fc2f38f5 not found: ID does not exist" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.215420 4794 scope.go:117] "RemoveContainer" containerID="6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.215847 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89\": container with ID starting with 6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89 not found: ID does not exist" containerID="6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.215887 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89"} err="failed to get container status \"6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89\": rpc error: code = NotFound desc = could not find container \"6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89\": container with ID starting with 6ec7f2cd3d492ae2b7d2e18cf9be6523aa92ba091824f0284e57b822f288ba89 not found: ID does not 
exist" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.215911 4794 scope.go:117] "RemoveContainer" containerID="72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.216272 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd\": container with ID starting with 72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd not found: ID does not exist" containerID="72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.216296 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd"} err="failed to get container status \"72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd\": rpc error: code = NotFound desc = could not find container \"72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd\": container with ID starting with 72c396ef6d430785748ca513278bc9de42c1cccb1eda1ce8afec3e3abf94e5bd not found: ID does not exist" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.247805 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-config-data" (OuterVolumeSpecName: "config-data") pod "29590762-7864-464f-984f-4626c6566a08" (UID: "29590762-7864-464f-984f-4626c6566a08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.307566 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29590762-7864-464f-984f-4626c6566a08-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.503642 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.517820 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.532636 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.533044 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="sg-core" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533062 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="sg-core" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.533109 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-api" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533116 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-api" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.533131 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-central-agent" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533137 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-central-agent" Dec 15 14:17:10 crc 
kubenswrapper[4794]: E1215 14:17:10.533147 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-notification-agent" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533152 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-notification-agent" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.533161 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c8042d6-36fa-4b7e-b7dc-280bd5621712" containerName="mariadb-account-delete" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533168 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c8042d6-36fa-4b7e-b7dc-280bd5621712" containerName="mariadb-account-delete" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.533178 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="proxy-httpd" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533184 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="proxy-httpd" Dec 15 14:17:10 crc kubenswrapper[4794]: E1215 14:17:10.533193 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-kuttl-api-log" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533199 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-kuttl-api-log" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533365 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-kuttl-api-log" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533383 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c8042d6-36fa-4b7e-b7dc-280bd5621712" 
containerName="mariadb-account-delete" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533394 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="proxy-httpd" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533407 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-notification-agent" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533419 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="ceilometer-central-agent" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533428 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12ed886-0e2b-4acb-aa40-79fe186ff023" containerName="watcher-api" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.533438 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="29590762-7864-464f-984f-4626c6566a08" containerName="sg-core" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.534975 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.540672 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.543731 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.544172 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.545656 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.614680 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.614769 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-run-httpd\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.614806 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-config-data\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.614928 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.615073 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-scripts\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.615169 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-log-httpd\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.615266 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9pmb\" (UniqueName: \"kubernetes.io/projected/107182a6-ee54-4ab4-8da4-fbc82912954c-kube-api-access-v9pmb\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.615393 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.664661 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716448 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-config-data\") pod \"1c204c98-7ab9-483e-bb80-f763162cfa19\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716493 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c204c98-7ab9-483e-bb80-f763162cfa19-logs\") pod \"1c204c98-7ab9-483e-bb80-f763162cfa19\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716526 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgtfx\" (UniqueName: \"kubernetes.io/projected/1c204c98-7ab9-483e-bb80-f763162cfa19-kube-api-access-hgtfx\") pod \"1c204c98-7ab9-483e-bb80-f763162cfa19\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716652 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-combined-ca-bundle\") pod \"1c204c98-7ab9-483e-bb80-f763162cfa19\" (UID: \"1c204c98-7ab9-483e-bb80-f763162cfa19\") " Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716768 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-scripts\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716827 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-log-httpd\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716859 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9pmb\" (UniqueName: \"kubernetes.io/projected/107182a6-ee54-4ab4-8da4-fbc82912954c-kube-api-access-v9pmb\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716888 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716936 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716951 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.716968 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-config-data\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.717061 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c204c98-7ab9-483e-bb80-f763162cfa19-logs" (OuterVolumeSpecName: "logs") pod "1c204c98-7ab9-483e-bb80-f763162cfa19" (UID: "1c204c98-7ab9-483e-bb80-f763162cfa19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.717511 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-log-httpd\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.721096 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-run-httpd\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.721594 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.722995 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-config-data\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.723105 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-scripts\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.723822 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c204c98-7ab9-483e-bb80-f763162cfa19-kube-api-access-hgtfx" (OuterVolumeSpecName: "kube-api-access-hgtfx") pod "1c204c98-7ab9-483e-bb80-f763162cfa19" (UID: "1c204c98-7ab9-483e-bb80-f763162cfa19"). InnerVolumeSpecName "kube-api-access-hgtfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.735079 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.737174 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.739328 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9pmb\" (UniqueName: \"kubernetes.io/projected/107182a6-ee54-4ab4-8da4-fbc82912954c-kube-api-access-v9pmb\") pod \"ceilometer-0\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.744725 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c204c98-7ab9-483e-bb80-f763162cfa19" (UID: "1c204c98-7ab9-483e-bb80-f763162cfa19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.760744 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29590762-7864-464f-984f-4626c6566a08" path="/var/lib/kubelet/pods/29590762-7864-464f-984f-4626c6566a08/volumes" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.771063 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-config-data" (OuterVolumeSpecName: "config-data") pod "1c204c98-7ab9-483e-bb80-f763162cfa19" (UID: "1c204c98-7ab9-483e-bb80-f763162cfa19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.818979 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.819012 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c204c98-7ab9-483e-bb80-f763162cfa19-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.819030 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c204c98-7ab9-483e-bb80-f763162cfa19-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.819041 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgtfx\" (UniqueName: \"kubernetes.io/projected/1c204c98-7ab9-483e-bb80-f763162cfa19-kube-api-access-hgtfx\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:10 crc kubenswrapper[4794]: I1215 14:17:10.869437 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.130920 4794 generic.go:334] "Generic (PLEG): container finished" podID="1c204c98-7ab9-483e-bb80-f763162cfa19" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" exitCode=0 Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.131040 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.131030 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c204c98-7ab9-483e-bb80-f763162cfa19","Type":"ContainerDied","Data":"d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9"} Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.131406 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c204c98-7ab9-483e-bb80-f763162cfa19","Type":"ContainerDied","Data":"d41791f98e2cc4aa4fc3aaf70686e002fc0d8ae51a82c4ce7a18a8fc5ed87ce9"} Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.131437 4794 scope.go:117] "RemoveContainer" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.155538 4794 scope.go:117] "RemoveContainer" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" Dec 15 14:17:11 crc kubenswrapper[4794]: E1215 14:17:11.155962 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9\": container with ID starting with d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9 not found: ID does not exist" containerID="d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9" Dec 15 14:17:11 crc 
kubenswrapper[4794]: I1215 14:17:11.156010 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9"} err="failed to get container status \"d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9\": rpc error: code = NotFound desc = could not find container \"d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9\": container with ID starting with d030565acb44999a3ee665334a702fd77d090175827965ad22519827cde0deb9 not found: ID does not exist" Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.160917 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.167401 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.273721 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-d2tl4"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.280972 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-d2tl4"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.288483 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher258e-account-delete-jzw64"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.295157 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-258e-account-create-tk96s"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.302687 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher258e-account-delete-jzw64"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.309244 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-258e-account-create-tk96s"] Dec 15 14:17:11 crc 
kubenswrapper[4794]: W1215 14:17:11.332178 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod107182a6_ee54_4ab4_8da4_fbc82912954c.slice/crio-66ae898ec316d3133501a3f96d0a6138224dfefbd124862492eb9e977c672717 WatchSource:0}: Error finding container 66ae898ec316d3133501a3f96d0a6138224dfefbd124862492eb9e977c672717: Status 404 returned error can't find the container with id 66ae898ec316d3133501a3f96d0a6138224dfefbd124862492eb9e977c672717 Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.337610 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:11 crc kubenswrapper[4794]: I1215 14:17:11.970758 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.045191 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-custom-prometheus-ca\") pod \"d9b6786b-b051-4b68-9bb8-4fa711927c71\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.045243 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b6786b-b051-4b68-9bb8-4fa711927c71-logs\") pod \"d9b6786b-b051-4b68-9bb8-4fa711927c71\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.045364 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x87dp\" (UniqueName: \"kubernetes.io/projected/d9b6786b-b051-4b68-9bb8-4fa711927c71-kube-api-access-x87dp\") pod \"d9b6786b-b051-4b68-9bb8-4fa711927c71\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 
14:17:12.045482 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-config-data\") pod \"d9b6786b-b051-4b68-9bb8-4fa711927c71\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.045525 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-combined-ca-bundle\") pod \"d9b6786b-b051-4b68-9bb8-4fa711927c71\" (UID: \"d9b6786b-b051-4b68-9bb8-4fa711927c71\") " Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.045854 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b6786b-b051-4b68-9bb8-4fa711927c71-logs" (OuterVolumeSpecName: "logs") pod "d9b6786b-b051-4b68-9bb8-4fa711927c71" (UID: "d9b6786b-b051-4b68-9bb8-4fa711927c71"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.050379 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b6786b-b051-4b68-9bb8-4fa711927c71-kube-api-access-x87dp" (OuterVolumeSpecName: "kube-api-access-x87dp") pod "d9b6786b-b051-4b68-9bb8-4fa711927c71" (UID: "d9b6786b-b051-4b68-9bb8-4fa711927c71"). InnerVolumeSpecName "kube-api-access-x87dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.069954 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d9b6786b-b051-4b68-9bb8-4fa711927c71" (UID: "d9b6786b-b051-4b68-9bb8-4fa711927c71"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.074239 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9b6786b-b051-4b68-9bb8-4fa711927c71" (UID: "d9b6786b-b051-4b68-9bb8-4fa711927c71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.099980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-config-data" (OuterVolumeSpecName: "config-data") pod "d9b6786b-b051-4b68-9bb8-4fa711927c71" (UID: "d9b6786b-b051-4b68-9bb8-4fa711927c71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.140922 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerStarted","Data":"f2f13d4598186a15ef71ef82b148daedd6320a061ef8537700cd6be39ad72734"} Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.140976 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerStarted","Data":"66ae898ec316d3133501a3f96d0a6138224dfefbd124862492eb9e977c672717"} Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.143179 4794 generic.go:334] "Generic (PLEG): container finished" podID="d9b6786b-b051-4b68-9bb8-4fa711927c71" containerID="55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30" exitCode=0 Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.143248 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"d9b6786b-b051-4b68-9bb8-4fa711927c71","Type":"ContainerDied","Data":"55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30"} Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.143274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d9b6786b-b051-4b68-9bb8-4fa711927c71","Type":"ContainerDied","Data":"d356d9cf89b1fc1ed53b71312fdf7a46d85a47dce830546fe78f1518c30b4934"} Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.143281 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.143291 4794 scope.go:117] "RemoveContainer" containerID="55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.146843 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.146859 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.146868 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d9b6786b-b051-4b68-9bb8-4fa711927c71-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.146877 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b6786b-b051-4b68-9bb8-4fa711927c71-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.146885 4794 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x87dp\" (UniqueName: \"kubernetes.io/projected/d9b6786b-b051-4b68-9bb8-4fa711927c71-kube-api-access-x87dp\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.178554 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.182301 4794 scope.go:117] "RemoveContainer" containerID="55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30" Dec 15 14:17:12 crc kubenswrapper[4794]: E1215 14:17:12.182876 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30\": container with ID starting with 55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30 not found: ID does not exist" containerID="55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.182923 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30"} err="failed to get container status \"55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30\": rpc error: code = NotFound desc = could not find container \"55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30\": container with ID starting with 55eb8b876aaa88d224c06afa318faf28a1a3973b4a81a5e8717143d743977e30 not found: ID does not exist" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.188251 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.750847 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c204c98-7ab9-483e-bb80-f763162cfa19" 
path="/var/lib/kubelet/pods/1c204c98-7ab9-483e-bb80-f763162cfa19/volumes" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.751549 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3412d7d8-45cd-4978-93be-103f1605bc49" path="/var/lib/kubelet/pods/3412d7d8-45cd-4978-93be-103f1605bc49/volumes" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.752193 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8042d6-36fa-4b7e-b7dc-280bd5621712" path="/var/lib/kubelet/pods/9c8042d6-36fa-4b7e-b7dc-280bd5621712/volumes" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.752843 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9717ec4-73dc-4eea-8d0b-69e55280c21f" path="/var/lib/kubelet/pods/d9717ec4-73dc-4eea-8d0b-69e55280c21f/volumes" Dec 15 14:17:12 crc kubenswrapper[4794]: I1215 14:17:12.759341 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b6786b-b051-4b68-9bb8-4fa711927c71" path="/var/lib/kubelet/pods/d9b6786b-b051-4b68-9bb8-4fa711927c71/volumes" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.153624 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerStarted","Data":"287e1e480948b9a4a7f73ab232646f004e55102f82d44f78aa22b2c00e73b541"} Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.371003 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-cns42"] Dec 15 14:17:13 crc kubenswrapper[4794]: E1215 14:17:13.371324 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b6786b-b051-4b68-9bb8-4fa711927c71" containerName="watcher-decision-engine" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.371342 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b6786b-b051-4b68-9bb8-4fa711927c71" containerName="watcher-decision-engine" Dec 15 14:17:13 crc 
kubenswrapper[4794]: E1215 14:17:13.371359 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c204c98-7ab9-483e-bb80-f763162cfa19" containerName="watcher-applier" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.371365 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c204c98-7ab9-483e-bb80-f763162cfa19" containerName="watcher-applier" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.371561 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b6786b-b051-4b68-9bb8-4fa711927c71" containerName="watcher-decision-engine" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.371594 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c204c98-7ab9-483e-bb80-f763162cfa19" containerName="watcher-applier" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.372187 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.394378 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cns42"] Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.478526 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlf8\" (UniqueName: \"kubernetes.io/projected/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0-kube-api-access-zrlf8\") pod \"watcher-db-create-cns42\" (UID: \"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0\") " pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.580229 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlf8\" (UniqueName: \"kubernetes.io/projected/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0-kube-api-access-zrlf8\") pod \"watcher-db-create-cns42\" (UID: \"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0\") " pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:13 
crc kubenswrapper[4794]: I1215 14:17:13.598434 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlf8\" (UniqueName: \"kubernetes.io/projected/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0-kube-api-access-zrlf8\") pod \"watcher-db-create-cns42\" (UID: \"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0\") " pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:13 crc kubenswrapper[4794]: I1215 14:17:13.689052 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:14 crc kubenswrapper[4794]: I1215 14:17:14.125271 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cns42"] Dec 15 14:17:14 crc kubenswrapper[4794]: I1215 14:17:14.163624 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerStarted","Data":"71a38b5614c39ed2e3cbebdd6ff1110fe91f167880bf587807581b783339c528"} Dec 15 14:17:14 crc kubenswrapper[4794]: I1215 14:17:14.164642 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-cns42" event={"ID":"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0","Type":"ContainerStarted","Data":"387a4b9aed569bb1072e60608b1f70c02dc9b9124261ff95fd3e097c3b03a8fa"} Dec 15 14:17:15 crc kubenswrapper[4794]: I1215 14:17:15.178094 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerStarted","Data":"221a177c183b66018de199e76eebda8cf275e0c7570cb5807add6f5df4db4a11"} Dec 15 14:17:15 crc kubenswrapper[4794]: I1215 14:17:15.178435 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:15 crc kubenswrapper[4794]: I1215 14:17:15.180432 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0" containerID="68905e4811cd3e0a006919c95853600fa8465a774793d694a28b9057cc5f7b39" exitCode=0 Dec 15 14:17:15 crc kubenswrapper[4794]: I1215 14:17:15.180492 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-cns42" event={"ID":"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0","Type":"ContainerDied","Data":"68905e4811cd3e0a006919c95853600fa8465a774793d694a28b9057cc5f7b39"} Dec 15 14:17:15 crc kubenswrapper[4794]: I1215 14:17:15.211618 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.790623945 podStartE2EDuration="5.211577251s" podCreationTimestamp="2025-12-15 14:17:10 +0000 UTC" firstStartedPulling="2025-12-15 14:17:11.335745285 +0000 UTC m=+1393.187767743" lastFinishedPulling="2025-12-15 14:17:14.756698601 +0000 UTC m=+1396.608721049" observedRunningTime="2025-12-15 14:17:15.206884739 +0000 UTC m=+1397.058907207" watchObservedRunningTime="2025-12-15 14:17:15.211577251 +0000 UTC m=+1397.063599699" Dec 15 14:17:16 crc kubenswrapper[4794]: I1215 14:17:16.557407 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:16 crc kubenswrapper[4794]: I1215 14:17:16.625726 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrlf8\" (UniqueName: \"kubernetes.io/projected/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0-kube-api-access-zrlf8\") pod \"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0\" (UID: \"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0\") " Dec 15 14:17:16 crc kubenswrapper[4794]: I1215 14:17:16.630551 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0-kube-api-access-zrlf8" (OuterVolumeSpecName: "kube-api-access-zrlf8") pod "3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0" (UID: "3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0"). InnerVolumeSpecName "kube-api-access-zrlf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:16 crc kubenswrapper[4794]: I1215 14:17:16.728245 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrlf8\" (UniqueName: \"kubernetes.io/projected/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0-kube-api-access-zrlf8\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:17 crc kubenswrapper[4794]: I1215 14:17:17.200628 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-cns42" event={"ID":"3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0","Type":"ContainerDied","Data":"387a4b9aed569bb1072e60608b1f70c02dc9b9124261ff95fd3e097c3b03a8fa"} Dec 15 14:17:17 crc kubenswrapper[4794]: I1215 14:17:17.200975 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="387a4b9aed569bb1072e60608b1f70c02dc9b9124261ff95fd3e097c3b03a8fa" Dec 15 14:17:17 crc kubenswrapper[4794]: I1215 14:17:17.201068 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-cns42" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.391972 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-b5f7-account-create-8nkls"] Dec 15 14:17:23 crc kubenswrapper[4794]: E1215 14:17:23.392772 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0" containerName="mariadb-database-create" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.392783 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0" containerName="mariadb-database-create" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.392956 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0" containerName="mariadb-database-create" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.393573 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.396450 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.403188 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-b5f7-account-create-8nkls"] Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.430700 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnplf\" (UniqueName: \"kubernetes.io/projected/49f4d206-cdfb-4963-9fe6-14153f253725-kube-api-access-xnplf\") pod \"watcher-b5f7-account-create-8nkls\" (UID: \"49f4d206-cdfb-4963-9fe6-14153f253725\") " pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.531571 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xnplf\" (UniqueName: \"kubernetes.io/projected/49f4d206-cdfb-4963-9fe6-14153f253725-kube-api-access-xnplf\") pod \"watcher-b5f7-account-create-8nkls\" (UID: \"49f4d206-cdfb-4963-9fe6-14153f253725\") " pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.560208 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnplf\" (UniqueName: \"kubernetes.io/projected/49f4d206-cdfb-4963-9fe6-14153f253725-kube-api-access-xnplf\") pod \"watcher-b5f7-account-create-8nkls\" (UID: \"49f4d206-cdfb-4963-9fe6-14153f253725\") " pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:23 crc kubenswrapper[4794]: I1215 14:17:23.714556 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:24 crc kubenswrapper[4794]: I1215 14:17:24.187717 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-b5f7-account-create-8nkls"] Dec 15 14:17:24 crc kubenswrapper[4794]: I1215 14:17:24.256773 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" event={"ID":"49f4d206-cdfb-4963-9fe6-14153f253725","Type":"ContainerStarted","Data":"1efcc2d5a55326e06f949187354657c5d0e1790822dc4e529509028ee9bd3ae2"} Dec 15 14:17:25 crc kubenswrapper[4794]: I1215 14:17:25.264087 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" event={"ID":"49f4d206-cdfb-4963-9fe6-14153f253725","Type":"ContainerStarted","Data":"65825606fd7bd743ce43eb07d98d10115e303558b720ba3f0239a9c1c7486272"} Dec 15 14:17:25 crc kubenswrapper[4794]: I1215 14:17:25.281138 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" 
podStartSLOduration=2.281122963 podStartE2EDuration="2.281122963s" podCreationTimestamp="2025-12-15 14:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:17:25.277013237 +0000 UTC m=+1407.129035675" watchObservedRunningTime="2025-12-15 14:17:25.281122963 +0000 UTC m=+1407.133145401" Dec 15 14:17:26 crc kubenswrapper[4794]: I1215 14:17:26.272637 4794 generic.go:334] "Generic (PLEG): container finished" podID="49f4d206-cdfb-4963-9fe6-14153f253725" containerID="65825606fd7bd743ce43eb07d98d10115e303558b720ba3f0239a9c1c7486272" exitCode=0 Dec 15 14:17:26 crc kubenswrapper[4794]: I1215 14:17:26.272683 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" event={"ID":"49f4d206-cdfb-4963-9fe6-14153f253725","Type":"ContainerDied","Data":"65825606fd7bd743ce43eb07d98d10115e303558b720ba3f0239a9c1c7486272"} Dec 15 14:17:27 crc kubenswrapper[4794]: I1215 14:17:27.629316 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:27 crc kubenswrapper[4794]: I1215 14:17:27.735632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnplf\" (UniqueName: \"kubernetes.io/projected/49f4d206-cdfb-4963-9fe6-14153f253725-kube-api-access-xnplf\") pod \"49f4d206-cdfb-4963-9fe6-14153f253725\" (UID: \"49f4d206-cdfb-4963-9fe6-14153f253725\") " Dec 15 14:17:27 crc kubenswrapper[4794]: I1215 14:17:27.740472 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f4d206-cdfb-4963-9fe6-14153f253725-kube-api-access-xnplf" (OuterVolumeSpecName: "kube-api-access-xnplf") pod "49f4d206-cdfb-4963-9fe6-14153f253725" (UID: "49f4d206-cdfb-4963-9fe6-14153f253725"). InnerVolumeSpecName "kube-api-access-xnplf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:27 crc kubenswrapper[4794]: I1215 14:17:27.837218 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnplf\" (UniqueName: \"kubernetes.io/projected/49f4d206-cdfb-4963-9fe6-14153f253725-kube-api-access-xnplf\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.295047 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" event={"ID":"49f4d206-cdfb-4963-9fe6-14153f253725","Type":"ContainerDied","Data":"1efcc2d5a55326e06f949187354657c5d0e1790822dc4e529509028ee9bd3ae2"} Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.295352 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1efcc2d5a55326e06f949187354657c5d0e1790822dc4e529509028ee9bd3ae2" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.295108 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b5f7-account-create-8nkls" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.669072 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x7snh"] Dec 15 14:17:28 crc kubenswrapper[4794]: E1215 14:17:28.669497 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f4d206-cdfb-4963-9fe6-14153f253725" containerName="mariadb-account-create" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.669513 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f4d206-cdfb-4963-9fe6-14153f253725" containerName="mariadb-account-create" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.669798 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f4d206-cdfb-4963-9fe6-14153f253725" containerName="mariadb-account-create" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.670475 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.671961 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-98fhn" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.674038 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.685331 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x7snh"] Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.749439 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.749497 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-db-sync-config-data\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.749774 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-config-data\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.749983 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhdc\" (UniqueName: \"kubernetes.io/projected/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-kube-api-access-cqhdc\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.851745 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-config-data\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.851855 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhdc\" (UniqueName: \"kubernetes.io/projected/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-kube-api-access-cqhdc\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.851924 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.852004 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-db-sync-config-data\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 
14:17:28.858168 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-db-sync-config-data\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.858390 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-config-data\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.861200 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.869699 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhdc\" (UniqueName: \"kubernetes.io/projected/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-kube-api-access-cqhdc\") pod \"watcher-kuttl-db-sync-x7snh\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:28 crc kubenswrapper[4794]: I1215 14:17:28.993042 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:29 crc kubenswrapper[4794]: I1215 14:17:29.478825 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x7snh"] Dec 15 14:17:30 crc kubenswrapper[4794]: I1215 14:17:30.310817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" event={"ID":"8fcc69bc-b36a-4367-bed5-e3a00fe1b701","Type":"ContainerStarted","Data":"4a9c8aec345f8741051df20d4437351ff67e6c017612f9ea4905e46f0d4a5a48"} Dec 15 14:17:30 crc kubenswrapper[4794]: I1215 14:17:30.311163 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" event={"ID":"8fcc69bc-b36a-4367-bed5-e3a00fe1b701","Type":"ContainerStarted","Data":"20381ca762e8f4481220197d92b45a5cc3d66b1686642541d349e2a6d0f22e4c"} Dec 15 14:17:30 crc kubenswrapper[4794]: I1215 14:17:30.330219 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" podStartSLOduration=2.330203489 podStartE2EDuration="2.330203489s" podCreationTimestamp="2025-12-15 14:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:17:30.323013016 +0000 UTC m=+1412.175035454" watchObservedRunningTime="2025-12-15 14:17:30.330203489 +0000 UTC m=+1412.182225937" Dec 15 14:17:32 crc kubenswrapper[4794]: I1215 14:17:32.326844 4794 generic.go:334] "Generic (PLEG): container finished" podID="8fcc69bc-b36a-4367-bed5-e3a00fe1b701" containerID="4a9c8aec345f8741051df20d4437351ff67e6c017612f9ea4905e46f0d4a5a48" exitCode=0 Dec 15 14:17:32 crc kubenswrapper[4794]: I1215 14:17:32.326955 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" 
event={"ID":"8fcc69bc-b36a-4367-bed5-e3a00fe1b701","Type":"ContainerDied","Data":"4a9c8aec345f8741051df20d4437351ff67e6c017612f9ea4905e46f0d4a5a48"} Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.649155 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.726510 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-config-data\") pod \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.726632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqhdc\" (UniqueName: \"kubernetes.io/projected/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-kube-api-access-cqhdc\") pod \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.726682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-db-sync-config-data\") pod \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.726701 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-combined-ca-bundle\") pod \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\" (UID: \"8fcc69bc-b36a-4367-bed5-e3a00fe1b701\") " Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.739841 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-kube-api-access-cqhdc" (OuterVolumeSpecName: "kube-api-access-cqhdc") pod "8fcc69bc-b36a-4367-bed5-e3a00fe1b701" (UID: "8fcc69bc-b36a-4367-bed5-e3a00fe1b701"). InnerVolumeSpecName "kube-api-access-cqhdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.739832 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8fcc69bc-b36a-4367-bed5-e3a00fe1b701" (UID: "8fcc69bc-b36a-4367-bed5-e3a00fe1b701"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.750451 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fcc69bc-b36a-4367-bed5-e3a00fe1b701" (UID: "8fcc69bc-b36a-4367-bed5-e3a00fe1b701"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.770784 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-config-data" (OuterVolumeSpecName: "config-data") pod "8fcc69bc-b36a-4367-bed5-e3a00fe1b701" (UID: "8fcc69bc-b36a-4367-bed5-e3a00fe1b701"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.832247 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.832282 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqhdc\" (UniqueName: \"kubernetes.io/projected/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-kube-api-access-cqhdc\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.832291 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:33 crc kubenswrapper[4794]: I1215 14:17:33.832301 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcc69bc-b36a-4367-bed5-e3a00fe1b701-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.344225 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" event={"ID":"8fcc69bc-b36a-4367-bed5-e3a00fe1b701","Type":"ContainerDied","Data":"20381ca762e8f4481220197d92b45a5cc3d66b1686642541d349e2a6d0f22e4c"} Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.344265 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20381ca762e8f4481220197d92b45a5cc3d66b1686642541d349e2a6d0f22e4c" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.344331 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-x7snh" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.585507 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:34 crc kubenswrapper[4794]: E1215 14:17:34.586233 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fcc69bc-b36a-4367-bed5-e3a00fe1b701" containerName="watcher-kuttl-db-sync" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.586260 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fcc69bc-b36a-4367-bed5-e3a00fe1b701" containerName="watcher-kuttl-db-sync" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.586491 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fcc69bc-b36a-4367-bed5-e3a00fe1b701" containerName="watcher-kuttl-db-sync" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.587514 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.589026 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-98fhn" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.589447 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.603094 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.690746 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.692235 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.694752 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.700418 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.708227 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.709545 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.712151 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.745653 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.746253 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.746312 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-logs\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.746335 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.746399 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnww9\" (UniqueName: \"kubernetes.io/projected/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-kube-api-access-nnww9\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.752882 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847069 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847112 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847138 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tztlq\" (UniqueName: \"kubernetes.io/projected/232868d5-8a61-4490-b2bf-c177e6d4d2bf-kube-api-access-tztlq\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847175 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847191 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86346c16-baaa-4a00-a83a-1a48ed91b7fd-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847355 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847472 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfc5t\" (UniqueName: \"kubernetes.io/projected/86346c16-baaa-4a00-a83a-1a48ed91b7fd-kube-api-access-gfc5t\") pod 
\"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847636 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847736 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847823 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847871 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-logs\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847900 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232868d5-8a61-4490-b2bf-c177e6d4d2bf-logs\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847934 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.847988 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnww9\" (UniqueName: \"kubernetes.io/projected/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-kube-api-access-nnww9\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.848279 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-logs\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.854367 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.854709 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.863463 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.863661 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnww9\" (UniqueName: \"kubernetes.io/projected/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-kube-api-access-nnww9\") pod \"watcher-kuttl-api-0\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948567 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948644 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tztlq\" (UniqueName: \"kubernetes.io/projected/232868d5-8a61-4490-b2bf-c177e6d4d2bf-kube-api-access-tztlq\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948694 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86346c16-baaa-4a00-a83a-1a48ed91b7fd-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948737 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948775 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfc5t\" (UniqueName: \"kubernetes.io/projected/86346c16-baaa-4a00-a83a-1a48ed91b7fd-kube-api-access-gfc5t\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948818 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948856 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.948882 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232868d5-8a61-4490-b2bf-c177e6d4d2bf-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.949402 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86346c16-baaa-4a00-a83a-1a48ed91b7fd-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.949429 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232868d5-8a61-4490-b2bf-c177e6d4d2bf-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.951515 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.951630 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: 
I1215 14:17:34.952639 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.952988 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.955116 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.959940 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.964513 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfc5t\" (UniqueName: \"kubernetes.io/projected/86346c16-baaa-4a00-a83a-1a48ed91b7fd-kube-api-access-gfc5t\") pod \"watcher-kuttl-applier-0\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:34 crc kubenswrapper[4794]: I1215 14:17:34.967324 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tztlq\" (UniqueName: \"kubernetes.io/projected/232868d5-8a61-4490-b2bf-c177e6d4d2bf-kube-api-access-tztlq\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:35 crc kubenswrapper[4794]: I1215 14:17:35.011902 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:35 crc kubenswrapper[4794]: I1215 14:17:35.028209 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:35 crc kubenswrapper[4794]: I1215 14:17:35.452398 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:35 crc kubenswrapper[4794]: W1215 14:17:35.453120 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd57a6462_0dd7_4b07_b27f_be6b6e853ff5.slice/crio-b1a3ad8467ff32fe1f92af9f0a8fc79eff12947386750d457084466bef3ef4ed WatchSource:0}: Error finding container b1a3ad8467ff32fe1f92af9f0a8fc79eff12947386750d457084466bef3ef4ed: Status 404 returned error can't find the container with id b1a3ad8467ff32fe1f92af9f0a8fc79eff12947386750d457084466bef3ef4ed Dec 15 14:17:35 crc kubenswrapper[4794]: I1215 14:17:35.563044 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:35 crc kubenswrapper[4794]: I1215 14:17:35.604143 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.370506 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d57a6462-0dd7-4b07-b27f-be6b6e853ff5","Type":"ContainerStarted","Data":"3eb5e8db7f6d110015a204e380537e012ced41e9da4046500a1948d2fd3512bf"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.371766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d57a6462-0dd7-4b07-b27f-be6b6e853ff5","Type":"ContainerStarted","Data":"13aea039e50d54d7aebd2bd1dcffec0a308810e149aefaa04836426cee5e82d2"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.371841 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:36 crc kubenswrapper[4794]: 
I1215 14:17:36.371895 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d57a6462-0dd7-4b07-b27f-be6b6e853ff5","Type":"ContainerStarted","Data":"b1a3ad8467ff32fe1f92af9f0a8fc79eff12947386750d457084466bef3ef4ed"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.372526 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"232868d5-8a61-4490-b2bf-c177e6d4d2bf","Type":"ContainerStarted","Data":"d55e93104813b6f95437a52ec88f955d77e211c969b2599b760d3996b2939d03"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.372573 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"232868d5-8a61-4490-b2bf-c177e6d4d2bf","Type":"ContainerStarted","Data":"5788a8356a33669ca8fa1c2a5276d8598ae0b55bc51a5c839e61f728495b8a8c"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.374017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"86346c16-baaa-4a00-a83a-1a48ed91b7fd","Type":"ContainerStarted","Data":"ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.374063 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"86346c16-baaa-4a00-a83a-1a48ed91b7fd","Type":"ContainerStarted","Data":"6dc9418556886cbab4216e684655a32b46011addb4f19551ac65190652b1ce03"} Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.397929 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.3979113229999998 podStartE2EDuration="2.397911323s" podCreationTimestamp="2025-12-15 14:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-15 14:17:36.391509862 +0000 UTC m=+1418.243532300" watchObservedRunningTime="2025-12-15 14:17:36.397911323 +0000 UTC m=+1418.249933761" Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.411444 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.4114248050000002 podStartE2EDuration="2.411424805s" podCreationTimestamp="2025-12-15 14:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:17:36.40770019 +0000 UTC m=+1418.259722648" watchObservedRunningTime="2025-12-15 14:17:36.411424805 +0000 UTC m=+1418.263447253" Dec 15 14:17:36 crc kubenswrapper[4794]: I1215 14:17:36.424399 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.424384141 podStartE2EDuration="2.424384141s" podCreationTimestamp="2025-12-15 14:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:17:36.421729816 +0000 UTC m=+1418.273752284" watchObservedRunningTime="2025-12-15 14:17:36.424384141 +0000 UTC m=+1418.276406579" Dec 15 14:17:38 crc kubenswrapper[4794]: I1215 14:17:38.387803 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 14:17:38 crc kubenswrapper[4794]: I1215 14:17:38.578228 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:39 crc kubenswrapper[4794]: I1215 14:17:39.960099 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:40 crc kubenswrapper[4794]: I1215 14:17:40.012376 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:40 crc kubenswrapper[4794]: I1215 14:17:40.887946 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.474048 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvmxr"] Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.476166 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.492476 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvmxr"] Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.521982 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-catalog-content\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.522082 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmfp6\" (UniqueName: \"kubernetes.io/projected/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-kube-api-access-jmfp6\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.522124 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-utilities\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 
14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.623529 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-catalog-content\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.623661 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmfp6\" (UniqueName: \"kubernetes.io/projected/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-kube-api-access-jmfp6\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.623695 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-utilities\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.624514 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-utilities\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.624509 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-catalog-content\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.647656 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmfp6\" (UniqueName: \"kubernetes.io/projected/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-kube-api-access-jmfp6\") pod \"redhat-operators-mvmxr\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") " pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.841907 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.963735 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:44 crc kubenswrapper[4794]: I1215 14:17:44.970937 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.013181 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.034795 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.069624 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.072863 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.320215 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvmxr"] Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.457614 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerStarted","Data":"eb3de9fd5404934f668b4e824b0e0d95c6d0bcf14421c788ff19580388c533fc"} Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.458084 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.462805 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.490578 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:45 crc kubenswrapper[4794]: I1215 14:17:45.494284 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.467638 4794 generic.go:334] "Generic (PLEG): container finished" podID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerID="bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43" exitCode=0 Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.468929 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerDied","Data":"bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43"} Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.859120 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.859762 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-central-agent" 
containerID="cri-o://f2f13d4598186a15ef71ef82b148daedd6320a061ef8537700cd6be39ad72734" gracePeriod=30 Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.859847 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-notification-agent" containerID="cri-o://287e1e480948b9a4a7f73ab232646f004e55102f82d44f78aa22b2c00e73b541" gracePeriod=30 Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.859850 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="sg-core" containerID="cri-o://71a38b5614c39ed2e3cbebdd6ff1110fe91f167880bf587807581b783339c528" gracePeriod=30 Dec 15 14:17:46 crc kubenswrapper[4794]: I1215 14:17:46.859956 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="proxy-httpd" containerID="cri-o://221a177c183b66018de199e76eebda8cf275e0c7570cb5807add6f5df4db4a11" gracePeriod=30 Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.480212 4794 generic.go:334] "Generic (PLEG): container finished" podID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerID="221a177c183b66018de199e76eebda8cf275e0c7570cb5807add6f5df4db4a11" exitCode=0 Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.480530 4794 generic.go:334] "Generic (PLEG): container finished" podID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerID="71a38b5614c39ed2e3cbebdd6ff1110fe91f167880bf587807581b783339c528" exitCode=2 Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.480543 4794 generic.go:334] "Generic (PLEG): container finished" podID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerID="f2f13d4598186a15ef71ef82b148daedd6320a061ef8537700cd6be39ad72734" exitCode=0 Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 
14:17:47.480247 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerDied","Data":"221a177c183b66018de199e76eebda8cf275e0c7570cb5807add6f5df4db4a11"} Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.480647 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerDied","Data":"71a38b5614c39ed2e3cbebdd6ff1110fe91f167880bf587807581b783339c528"} Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.480665 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerDied","Data":"f2f13d4598186a15ef71ef82b148daedd6320a061ef8537700cd6be39ad72734"} Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.482694 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerStarted","Data":"47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1"} Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.953831 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x7snh"] Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.962771 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-x7snh"] Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.995924 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherb5f7-account-delete-6jl4b"] Dec 15 14:17:47 crc kubenswrapper[4794]: I1215 14:17:47.997250 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.007324 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherb5f7-account-delete-6jl4b"] Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.028729 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.060635 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.060845 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" containerName="watcher-applier" containerID="cri-o://ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de" gracePeriod=30 Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.108084 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.108333 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-kuttl-api-log" containerID="cri-o://13aea039e50d54d7aebd2bd1dcffec0a308810e149aefaa04836426cee5e82d2" gracePeriod=30 Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.108504 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-api" containerID="cri-o://3eb5e8db7f6d110015a204e380537e012ced41e9da4046500a1948d2fd3512bf" gracePeriod=30 Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.181608 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlz4j\" (UniqueName: \"kubernetes.io/projected/25ccfe1a-c34c-4a15-900e-de61130599e5-kube-api-access-zlz4j\") pod \"watcherb5f7-account-delete-6jl4b\" (UID: \"25ccfe1a-c34c-4a15-900e-de61130599e5\") " pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.284208 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlz4j\" (UniqueName: \"kubernetes.io/projected/25ccfe1a-c34c-4a15-900e-de61130599e5-kube-api-access-zlz4j\") pod \"watcherb5f7-account-delete-6jl4b\" (UID: \"25ccfe1a-c34c-4a15-900e-de61130599e5\") " pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.307732 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlz4j\" (UniqueName: \"kubernetes.io/projected/25ccfe1a-c34c-4a15-900e-de61130599e5-kube-api-access-zlz4j\") pod \"watcherb5f7-account-delete-6jl4b\" (UID: \"25ccfe1a-c34c-4a15-900e-de61130599e5\") " pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.315657 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.514842 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="232868d5-8a61-4490-b2bf-c177e6d4d2bf" containerName="watcher-decision-engine" containerID="cri-o://d55e93104813b6f95437a52ec88f955d77e211c969b2599b760d3996b2939d03" gracePeriod=30 Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.752017 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fcc69bc-b36a-4367-bed5-e3a00fe1b701" path="/var/lib/kubelet/pods/8fcc69bc-b36a-4367-bed5-e3a00fe1b701/volumes" Dec 15 14:17:48 crc kubenswrapper[4794]: I1215 14:17:48.843084 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherb5f7-account-delete-6jl4b"] Dec 15 14:17:48 crc kubenswrapper[4794]: W1215 14:17:48.871784 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ccfe1a_c34c_4a15_900e_de61130599e5.slice/crio-a57fedbb70d9a1517d63401c316552cfabd00ee84e2251c56002ec40860a105a WatchSource:0}: Error finding container a57fedbb70d9a1517d63401c316552cfabd00ee84e2251c56002ec40860a105a: Status 404 returned error can't find the container with id a57fedbb70d9a1517d63401c316552cfabd00ee84e2251c56002ec40860a105a Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.632856 4794 generic.go:334] "Generic (PLEG): container finished" podID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerID="3eb5e8db7f6d110015a204e380537e012ced41e9da4046500a1948d2fd3512bf" exitCode=0 Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.633191 4794 generic.go:334] "Generic (PLEG): container finished" podID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerID="13aea039e50d54d7aebd2bd1dcffec0a308810e149aefaa04836426cee5e82d2" exitCode=143 Dec 15 14:17:49 crc kubenswrapper[4794]: 
I1215 14:17:49.633277 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d57a6462-0dd7-4b07-b27f-be6b6e853ff5","Type":"ContainerDied","Data":"3eb5e8db7f6d110015a204e380537e012ced41e9da4046500a1948d2fd3512bf"} Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.633309 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d57a6462-0dd7-4b07-b27f-be6b6e853ff5","Type":"ContainerDied","Data":"13aea039e50d54d7aebd2bd1dcffec0a308810e149aefaa04836426cee5e82d2"} Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.650794 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" event={"ID":"25ccfe1a-c34c-4a15-900e-de61130599e5","Type":"ContainerStarted","Data":"1ba27cc53fa27620ff5b6f4db71b3310018a2f348c70d6d3bfb7eabd5585e8e4"} Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.650842 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" event={"ID":"25ccfe1a-c34c-4a15-900e-de61130599e5","Type":"ContainerStarted","Data":"a57fedbb70d9a1517d63401c316552cfabd00ee84e2251c56002ec40860a105a"} Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.681902 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" podStartSLOduration=2.681887781 podStartE2EDuration="2.681887781s" podCreationTimestamp="2025-12-15 14:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:17:49.678924418 +0000 UTC m=+1431.530946856" watchObservedRunningTime="2025-12-15 14:17:49.681887781 +0000 UTC m=+1431.533910219" Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.704244 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerID="287e1e480948b9a4a7f73ab232646f004e55102f82d44f78aa22b2c00e73b541" exitCode=0 Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.704289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerDied","Data":"287e1e480948b9a4a7f73ab232646f004e55102f82d44f78aa22b2c00e73b541"} Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.872326 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.986406 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnww9\" (UniqueName: \"kubernetes.io/projected/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-kube-api-access-nnww9\") pod \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.986520 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-custom-prometheus-ca\") pod \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.986550 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-combined-ca-bundle\") pod \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.986750 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-config-data\") pod 
\"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.986812 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-logs\") pod \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\" (UID: \"d57a6462-0dd7-4b07-b27f-be6b6e853ff5\") " Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.987243 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-logs" (OuterVolumeSpecName: "logs") pod "d57a6462-0dd7-4b07-b27f-be6b6e853ff5" (UID: "d57a6462-0dd7-4b07-b27f-be6b6e853ff5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:49 crc kubenswrapper[4794]: I1215 14:17:49.992707 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-kube-api-access-nnww9" (OuterVolumeSpecName: "kube-api-access-nnww9") pod "d57a6462-0dd7-4b07-b27f-be6b6e853ff5" (UID: "d57a6462-0dd7-4b07-b27f-be6b6e853ff5"). InnerVolumeSpecName "kube-api-access-nnww9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:50 crc kubenswrapper[4794]: E1215 14:17:50.020936 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:17:50 crc kubenswrapper[4794]: E1215 14:17:50.022277 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:17:50 crc kubenswrapper[4794]: E1215 14:17:50.023276 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:17:50 crc kubenswrapper[4794]: E1215 14:17:50.023338 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" containerName="watcher-applier" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.033796 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d57a6462-0dd7-4b07-b27f-be6b6e853ff5" (UID: "d57a6462-0dd7-4b07-b27f-be6b6e853ff5"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.034478 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-config-data" (OuterVolumeSpecName: "config-data") pod "d57a6462-0dd7-4b07-b27f-be6b6e853ff5" (UID: "d57a6462-0dd7-4b07-b27f-be6b6e853ff5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.034941 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d57a6462-0dd7-4b07-b27f-be6b6e853ff5" (UID: "d57a6462-0dd7-4b07-b27f-be6b6e853ff5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.088893 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.088946 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.088960 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnww9\" (UniqueName: \"kubernetes.io/projected/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-kube-api-access-nnww9\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.088974 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.088988 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57a6462-0dd7-4b07-b27f-be6b6e853ff5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.728807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d57a6462-0dd7-4b07-b27f-be6b6e853ff5","Type":"ContainerDied","Data":"b1a3ad8467ff32fe1f92af9f0a8fc79eff12947386750d457084466bef3ef4ed"} Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.729109 4794 scope.go:117] "RemoveContainer" containerID="3eb5e8db7f6d110015a204e380537e012ced41e9da4046500a1948d2fd3512bf" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.728848 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.730848 4794 generic.go:334] "Generic (PLEG): container finished" podID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerID="47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1" exitCode=0 Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.730920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerDied","Data":"47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1"} Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.774628 4794 scope.go:117] "RemoveContainer" containerID="13aea039e50d54d7aebd2bd1dcffec0a308810e149aefaa04836426cee5e82d2" Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.785242 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:50 crc kubenswrapper[4794]: I1215 14:17:50.791999 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.041954 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.103817 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-run-httpd\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.103860 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9pmb\" (UniqueName: \"kubernetes.io/projected/107182a6-ee54-4ab4-8da4-fbc82912954c-kube-api-access-v9pmb\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.103892 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-scripts\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.103923 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-config-data\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.103938 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-combined-ca-bundle\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.103987 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-sg-core-conf-yaml\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.104021 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-log-httpd\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.104061 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-ceilometer-tls-certs\") pod \"107182a6-ee54-4ab4-8da4-fbc82912954c\" (UID: \"107182a6-ee54-4ab4-8da4-fbc82912954c\") " Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.104181 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.104980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.105355 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.105377 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/107182a6-ee54-4ab4-8da4-fbc82912954c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.111178 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-scripts" (OuterVolumeSpecName: "scripts") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.111491 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107182a6-ee54-4ab4-8da4-fbc82912954c-kube-api-access-v9pmb" (OuterVolumeSpecName: "kube-api-access-v9pmb") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "kube-api-access-v9pmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.140753 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.190776 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.199959 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.205635 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-config-data" (OuterVolumeSpecName: "config-data") pod "107182a6-ee54-4ab4-8da4-fbc82912954c" (UID: "107182a6-ee54-4ab4-8da4-fbc82912954c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.206353 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.206374 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.206383 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9pmb\" (UniqueName: \"kubernetes.io/projected/107182a6-ee54-4ab4-8da4-fbc82912954c-kube-api-access-v9pmb\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.206393 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.206401 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.206409 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107182a6-ee54-4ab4-8da4-fbc82912954c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.742978 4794 generic.go:334] "Generic (PLEG): container finished" podID="25ccfe1a-c34c-4a15-900e-de61130599e5" containerID="1ba27cc53fa27620ff5b6f4db71b3310018a2f348c70d6d3bfb7eabd5585e8e4" exitCode=0 Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.743037 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" event={"ID":"25ccfe1a-c34c-4a15-900e-de61130599e5","Type":"ContainerDied","Data":"1ba27cc53fa27620ff5b6f4db71b3310018a2f348c70d6d3bfb7eabd5585e8e4"} Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.747071 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"107182a6-ee54-4ab4-8da4-fbc82912954c","Type":"ContainerDied","Data":"66ae898ec316d3133501a3f96d0a6138224dfefbd124862492eb9e977c672717"} Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.747135 4794 scope.go:117] "RemoveContainer" containerID="221a177c183b66018de199e76eebda8cf275e0c7570cb5807add6f5df4db4a11" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.747172 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.751202 4794 generic.go:334] "Generic (PLEG): container finished" podID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" containerID="ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de" exitCode=0 Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.751282 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"86346c16-baaa-4a00-a83a-1a48ed91b7fd","Type":"ContainerDied","Data":"ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de"} Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.792834 4794 scope.go:117] "RemoveContainer" containerID="71a38b5614c39ed2e3cbebdd6ff1110fe91f167880bf587807581b783339c528" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.794222 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.804517 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.812946 4794 scope.go:117] "RemoveContainer" containerID="287e1e480948b9a4a7f73ab232646f004e55102f82d44f78aa22b2c00e73b541" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.829703 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:51 crc kubenswrapper[4794]: E1215 14:17:51.830117 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="proxy-httpd" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.830134 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="proxy-httpd" Dec 15 14:17:51 crc kubenswrapper[4794]: E1215 14:17:51.830150 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-central-agent" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.830158 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-central-agent" Dec 15 14:17:51 crc kubenswrapper[4794]: E1215 14:17:51.830170 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-notification-agent" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.830177 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-notification-agent" Dec 15 14:17:51 crc kubenswrapper[4794]: E1215 14:17:51.830188 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="sg-core" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.830195 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="sg-core" Dec 15 
14:17:51 crc kubenswrapper[4794]: E1215 14:17:51.830206 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-kuttl-api-log" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.830214 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-kuttl-api-log" Dec 15 14:17:51 crc kubenswrapper[4794]: E1215 14:17:51.831571 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-api" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831595 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-api" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831771 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-central-agent" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831785 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="ceilometer-notification-agent" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831797 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="sg-core" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831809 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" containerName="watcher-api" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831819 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" containerName="proxy-httpd" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.831829 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" 
containerName="watcher-kuttl-api-log" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.834595 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.842356 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.842789 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.843959 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.844206 4794 scope.go:117] "RemoveContainer" containerID="f2f13d4598186a15ef71ef82b148daedd6320a061ef8537700cd6be39ad72734" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.859310 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916413 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-run-httpd\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916463 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-scripts\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916487 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-m67r5\" (UniqueName: \"kubernetes.io/projected/77fe0cdf-9f81-47f7-b888-adaf749e562a-kube-api-access-m67r5\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916527 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916551 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916614 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-config-data\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916638 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-log-httpd\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:51 crc kubenswrapper[4794]: I1215 14:17:51.916664 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.017884 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-scripts\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.017944 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m67r5\" (UniqueName: \"kubernetes.io/projected/77fe0cdf-9f81-47f7-b888-adaf749e562a-kube-api-access-m67r5\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.018023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.018051 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.018104 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-config-data\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.018130 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-log-httpd\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.018162 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.018232 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-run-httpd\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.019012 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-log-httpd\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.019134 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-run-httpd\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.023189 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.023294 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-config-data\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.023658 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.023823 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.025637 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-scripts\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.046883 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m67r5\" (UniqueName: \"kubernetes.io/projected/77fe0cdf-9f81-47f7-b888-adaf749e562a-kube-api-access-m67r5\") pod \"ceilometer-0\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.111665 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.165763 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.166439 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.220723 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86346c16-baaa-4a00-a83a-1a48ed91b7fd-logs\") pod \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.220828 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-combined-ca-bundle\") pod \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.220910 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-config-data\") pod \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.220997 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfc5t\" (UniqueName: \"kubernetes.io/projected/86346c16-baaa-4a00-a83a-1a48ed91b7fd-kube-api-access-gfc5t\") pod \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\" (UID: \"86346c16-baaa-4a00-a83a-1a48ed91b7fd\") " Dec 15 14:17:52 
crc kubenswrapper[4794]: I1215 14:17:52.221814 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86346c16-baaa-4a00-a83a-1a48ed91b7fd-logs" (OuterVolumeSpecName: "logs") pod "86346c16-baaa-4a00-a83a-1a48ed91b7fd" (UID: "86346c16-baaa-4a00-a83a-1a48ed91b7fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.226171 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86346c16-baaa-4a00-a83a-1a48ed91b7fd-kube-api-access-gfc5t" (OuterVolumeSpecName: "kube-api-access-gfc5t") pod "86346c16-baaa-4a00-a83a-1a48ed91b7fd" (UID: "86346c16-baaa-4a00-a83a-1a48ed91b7fd"). InnerVolumeSpecName "kube-api-access-gfc5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.261149 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86346c16-baaa-4a00-a83a-1a48ed91b7fd" (UID: "86346c16-baaa-4a00-a83a-1a48ed91b7fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.275776 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-config-data" (OuterVolumeSpecName: "config-data") pod "86346c16-baaa-4a00-a83a-1a48ed91b7fd" (UID: "86346c16-baaa-4a00-a83a-1a48ed91b7fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.322494 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.322523 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfc5t\" (UniqueName: \"kubernetes.io/projected/86346c16-baaa-4a00-a83a-1a48ed91b7fd-kube-api-access-gfc5t\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.322534 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86346c16-baaa-4a00-a83a-1a48ed91b7fd-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.322543 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86346c16-baaa-4a00-a83a-1a48ed91b7fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.626011 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.749183 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107182a6-ee54-4ab4-8da4-fbc82912954c" path="/var/lib/kubelet/pods/107182a6-ee54-4ab4-8da4-fbc82912954c/volumes" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.750110 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57a6462-0dd7-4b07-b27f-be6b6e853ff5" path="/var/lib/kubelet/pods/d57a6462-0dd7-4b07-b27f-be6b6e853ff5/volumes" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.773028 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"86346c16-baaa-4a00-a83a-1a48ed91b7fd","Type":"ContainerDied","Data":"6dc9418556886cbab4216e684655a32b46011addb4f19551ac65190652b1ce03"} Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.773098 4794 scope.go:117] "RemoveContainer" containerID="ce9079b1fb774be46563c4768c06567343f9460b774d91514174642a3c22d5de" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.773119 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.787679 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerStarted","Data":"c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b"} Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.789337 4794 generic.go:334] "Generic (PLEG): container finished" podID="232868d5-8a61-4490-b2bf-c177e6d4d2bf" containerID="d55e93104813b6f95437a52ec88f955d77e211c969b2599b760d3996b2939d03" exitCode=0 Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.789441 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"232868d5-8a61-4490-b2bf-c177e6d4d2bf","Type":"ContainerDied","Data":"d55e93104813b6f95437a52ec88f955d77e211c969b2599b760d3996b2939d03"} Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.792271 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerStarted","Data":"32e5ac44ff2057b26df37ede2a30e4a5597003fa99703b0343d23c8056d6709e"} Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.808503 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvmxr" podStartSLOduration=3.124011835 podStartE2EDuration="8.808484615s" 
podCreationTimestamp="2025-12-15 14:17:44 +0000 UTC" firstStartedPulling="2025-12-15 14:17:46.469637526 +0000 UTC m=+1428.321659964" lastFinishedPulling="2025-12-15 14:17:52.154110306 +0000 UTC m=+1434.006132744" observedRunningTime="2025-12-15 14:17:52.807561219 +0000 UTC m=+1434.659583657" watchObservedRunningTime="2025-12-15 14:17:52.808484615 +0000 UTC m=+1434.660507053" Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.832185 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.847291 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:17:52 crc kubenswrapper[4794]: I1215 14:17:52.963754 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.034761 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tztlq\" (UniqueName: \"kubernetes.io/projected/232868d5-8a61-4490-b2bf-c177e6d4d2bf-kube-api-access-tztlq\") pod \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.034864 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-combined-ca-bundle\") pod \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.034970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232868d5-8a61-4490-b2bf-c177e6d4d2bf-logs\") pod \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " Dec 15 
14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.035017 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-config-data\") pod \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.035114 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-custom-prometheus-ca\") pod \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\" (UID: \"232868d5-8a61-4490-b2bf-c177e6d4d2bf\") " Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.039928 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232868d5-8a61-4490-b2bf-c177e6d4d2bf-logs" (OuterVolumeSpecName: "logs") pod "232868d5-8a61-4490-b2bf-c177e6d4d2bf" (UID: "232868d5-8a61-4490-b2bf-c177e6d4d2bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.050708 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cns42"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.053826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232868d5-8a61-4490-b2bf-c177e6d4d2bf-kube-api-access-tztlq" (OuterVolumeSpecName: "kube-api-access-tztlq") pod "232868d5-8a61-4490-b2bf-c177e6d4d2bf" (UID: "232868d5-8a61-4490-b2bf-c177e6d4d2bf"). InnerVolumeSpecName "kube-api-access-tztlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.059929 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-cns42"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.059931 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "232868d5-8a61-4490-b2bf-c177e6d4d2bf" (UID: "232868d5-8a61-4490-b2bf-c177e6d4d2bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.071255 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "232868d5-8a61-4490-b2bf-c177e6d4d2bf" (UID: "232868d5-8a61-4490-b2bf-c177e6d4d2bf"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.074228 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherb5f7-account-delete-6jl4b"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.082659 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-b5f7-account-create-8nkls"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.087332 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-b5f7-account-create-8nkls"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.097767 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-config-data" (OuterVolumeSpecName: "config-data") pod "232868d5-8a61-4490-b2bf-c177e6d4d2bf" (UID: "232868d5-8a61-4490-b2bf-c177e6d4d2bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.125642 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.138335 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.138363 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tztlq\" (UniqueName: \"kubernetes.io/projected/232868d5-8a61-4490-b2bf-c177e6d4d2bf-kube-api-access-tztlq\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.138374 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.138382 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232868d5-8a61-4490-b2bf-c177e6d4d2bf-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.138392 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232868d5-8a61-4490-b2bf-c177e6d4d2bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.241086 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlz4j\" (UniqueName: \"kubernetes.io/projected/25ccfe1a-c34c-4a15-900e-de61130599e5-kube-api-access-zlz4j\") pod \"25ccfe1a-c34c-4a15-900e-de61130599e5\" (UID: \"25ccfe1a-c34c-4a15-900e-de61130599e5\") " Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.245026 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25ccfe1a-c34c-4a15-900e-de61130599e5-kube-api-access-zlz4j" (OuterVolumeSpecName: "kube-api-access-zlz4j") pod "25ccfe1a-c34c-4a15-900e-de61130599e5" (UID: "25ccfe1a-c34c-4a15-900e-de61130599e5"). InnerVolumeSpecName "kube-api-access-zlz4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.342573 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlz4j\" (UniqueName: \"kubernetes.io/projected/25ccfe1a-c34c-4a15-900e-de61130599e5-kube-api-access-zlz4j\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.803269 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"232868d5-8a61-4490-b2bf-c177e6d4d2bf","Type":"ContainerDied","Data":"5788a8356a33669ca8fa1c2a5276d8598ae0b55bc51a5c839e61f728495b8a8c"} Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.803355 4794 scope.go:117] "RemoveContainer" containerID="d55e93104813b6f95437a52ec88f955d77e211c969b2599b760d3996b2939d03" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.803508 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.806149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" event={"ID":"25ccfe1a-c34c-4a15-900e-de61130599e5","Type":"ContainerDied","Data":"a57fedbb70d9a1517d63401c316552cfabd00ee84e2251c56002ec40860a105a"} Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.806327 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57fedbb70d9a1517d63401c316552cfabd00ee84e2251c56002ec40860a105a" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.806185 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb5f7-account-delete-6jl4b" Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.808722 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerStarted","Data":"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"} Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.880895 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherb5f7-account-delete-6jl4b"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.906139 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherb5f7-account-delete-6jl4b"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.913650 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:53 crc kubenswrapper[4794]: I1215 14:17:53.919205 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.648687 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-lr55r"] Dec 15 14:17:54 crc kubenswrapper[4794]: E1215 14:17:54.649620 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" containerName="watcher-applier" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.649711 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" containerName="watcher-applier" Dec 15 14:17:54 crc kubenswrapper[4794]: E1215 14:17:54.649820 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ccfe1a-c34c-4a15-900e-de61130599e5" containerName="mariadb-account-delete" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.649894 4794 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="25ccfe1a-c34c-4a15-900e-de61130599e5" containerName="mariadb-account-delete" Dec 15 14:17:54 crc kubenswrapper[4794]: E1215 14:17:54.649990 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232868d5-8a61-4490-b2bf-c177e6d4d2bf" containerName="watcher-decision-engine" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.650061 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="232868d5-8a61-4490-b2bf-c177e6d4d2bf" containerName="watcher-decision-engine" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.650528 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="232868d5-8a61-4490-b2bf-c177e6d4d2bf" containerName="watcher-decision-engine" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.650639 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ccfe1a-c34c-4a15-900e-de61130599e5" containerName="mariadb-account-delete" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.650742 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" containerName="watcher-applier" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.651496 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.662013 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-lr55r"] Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.746315 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232868d5-8a61-4490-b2bf-c177e6d4d2bf" path="/var/lib/kubelet/pods/232868d5-8a61-4490-b2bf-c177e6d4d2bf/volumes" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.746757 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ccfe1a-c34c-4a15-900e-de61130599e5" path="/var/lib/kubelet/pods/25ccfe1a-c34c-4a15-900e-de61130599e5/volumes" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.747160 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0" path="/var/lib/kubelet/pods/3ed50dff-8d8f-4c00-9fe1-7cadc5fbd6d0/volumes" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.747629 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f4d206-cdfb-4963-9fe6-14153f253725" path="/var/lib/kubelet/pods/49f4d206-cdfb-4963-9fe6-14153f253725/volumes" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.748545 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86346c16-baaa-4a00-a83a-1a48ed91b7fd" path="/var/lib/kubelet/pods/86346c16-baaa-4a00-a83a-1a48ed91b7fd/volumes" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.772442 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfb9\" (UniqueName: \"kubernetes.io/projected/f790587d-a223-47c4-ba0d-79b0c2127f61-kube-api-access-bsfb9\") pod \"watcher-db-create-lr55r\" (UID: \"f790587d-a223-47c4-ba0d-79b0c2127f61\") " pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.822663 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerStarted","Data":"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"} Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.823368 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerStarted","Data":"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"} Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.842449 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.842504 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mvmxr" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.874182 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfb9\" (UniqueName: \"kubernetes.io/projected/f790587d-a223-47c4-ba0d-79b0c2127f61-kube-api-access-bsfb9\") pod \"watcher-db-create-lr55r\" (UID: \"f790587d-a223-47c4-ba0d-79b0c2127f61\") " pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:54 crc kubenswrapper[4794]: I1215 14:17:54.894167 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfb9\" (UniqueName: \"kubernetes.io/projected/f790587d-a223-47c4-ba0d-79b0c2127f61-kube-api-access-bsfb9\") pod \"watcher-db-create-lr55r\" (UID: \"f790587d-a223-47c4-ba0d-79b0c2127f61\") " pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:55 crc kubenswrapper[4794]: I1215 14:17:55.011178 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:55 crc kubenswrapper[4794]: I1215 14:17:55.503856 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-lr55r"] Dec 15 14:17:55 crc kubenswrapper[4794]: I1215 14:17:55.829874 4794 generic.go:334] "Generic (PLEG): container finished" podID="f790587d-a223-47c4-ba0d-79b0c2127f61" containerID="eca45b99c40e28cff944474b0e9f5460419af597bedfea01d5fdb79aaeb64dcf" exitCode=0 Dec 15 14:17:55 crc kubenswrapper[4794]: I1215 14:17:55.829926 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-lr55r" event={"ID":"f790587d-a223-47c4-ba0d-79b0c2127f61","Type":"ContainerDied","Data":"eca45b99c40e28cff944474b0e9f5460419af597bedfea01d5fdb79aaeb64dcf"} Dec 15 14:17:55 crc kubenswrapper[4794]: I1215 14:17:55.829978 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-lr55r" event={"ID":"f790587d-a223-47c4-ba0d-79b0c2127f61","Type":"ContainerStarted","Data":"217b214a4b8891a878ba0afa8cbb0c9e129987718ea74effa7548c9000971dff"} Dec 15 14:17:55 crc kubenswrapper[4794]: I1215 14:17:55.888171 4794 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mvmxr" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="registry-server" probeResult="failure" output=< Dec 15 14:17:55 crc kubenswrapper[4794]: timeout: failed to connect service ":50051" within 1s Dec 15 14:17:55 crc kubenswrapper[4794]: > Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.852298 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerStarted","Data":"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"} Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.852411 4794 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-central-agent" containerID="cri-o://56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae" gracePeriod=30 Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.852470 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="sg-core" containerID="cri-o://fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893" gracePeriod=30 Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.852864 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.852526 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-notification-agent" containerID="cri-o://78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8" gracePeriod=30 Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.852500 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="proxy-httpd" containerID="cri-o://b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1" gracePeriod=30 Dec 15 14:17:56 crc kubenswrapper[4794]: I1215 14:17:56.880541 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.58364547 podStartE2EDuration="5.880512378s" podCreationTimestamp="2025-12-15 14:17:51 +0000 UTC" firstStartedPulling="2025-12-15 14:17:52.666345007 +0000 UTC m=+1434.518367445" lastFinishedPulling="2025-12-15 14:17:55.963211915 +0000 UTC m=+1437.815234353" observedRunningTime="2025-12-15 
14:17:56.875106455 +0000 UTC m=+1438.727128893" watchObservedRunningTime="2025-12-15 14:17:56.880512378 +0000 UTC m=+1438.732534826" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.414199 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.518815 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsfb9\" (UniqueName: \"kubernetes.io/projected/f790587d-a223-47c4-ba0d-79b0c2127f61-kube-api-access-bsfb9\") pod \"f790587d-a223-47c4-ba0d-79b0c2127f61\" (UID: \"f790587d-a223-47c4-ba0d-79b0c2127f61\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.524123 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f790587d-a223-47c4-ba0d-79b0c2127f61-kube-api-access-bsfb9" (OuterVolumeSpecName: "kube-api-access-bsfb9") pod "f790587d-a223-47c4-ba0d-79b0c2127f61" (UID: "f790587d-a223-47c4-ba0d-79b0c2127f61"). InnerVolumeSpecName "kube-api-access-bsfb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.621110 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsfb9\" (UniqueName: \"kubernetes.io/projected/f790587d-a223-47c4-ba0d-79b0c2127f61-kube-api-access-bsfb9\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.693041 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824030 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-scripts\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824067 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-sg-core-conf-yaml\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824184 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-run-httpd\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824208 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-config-data\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824224 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-combined-ca-bundle\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m67r5\" (UniqueName: 
\"kubernetes.io/projected/77fe0cdf-9f81-47f7-b888-adaf749e562a-kube-api-access-m67r5\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824302 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-log-httpd\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824326 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-ceilometer-tls-certs\") pod \"77fe0cdf-9f81-47f7-b888-adaf749e562a\" (UID: \"77fe0cdf-9f81-47f7-b888-adaf749e562a\") " Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.824796 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.825187 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.829083 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fe0cdf-9f81-47f7-b888-adaf749e562a-kube-api-access-m67r5" (OuterVolumeSpecName: "kube-api-access-m67r5") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "kube-api-access-m67r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.829790 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-scripts" (OuterVolumeSpecName: "scripts") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.846921 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.863345 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-lr55r" event={"ID":"f790587d-a223-47c4-ba0d-79b0c2127f61","Type":"ContainerDied","Data":"217b214a4b8891a878ba0afa8cbb0c9e129987718ea74effa7548c9000971dff"} Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.863455 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217b214a4b8891a878ba0afa8cbb0c9e129987718ea74effa7548c9000971dff" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.863627 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-lr55r" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866726 4794 generic.go:334] "Generic (PLEG): container finished" podID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1" exitCode=0 Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866755 4794 generic.go:334] "Generic (PLEG): container finished" podID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893" exitCode=2 Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866763 4794 generic.go:334] "Generic (PLEG): container finished" podID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8" exitCode=0 Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866770 4794 generic.go:334] "Generic (PLEG): container finished" podID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae" exitCode=0 Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866791 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerDied","Data":"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"} Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866819 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerDied","Data":"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"} Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866829 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerDied","Data":"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"} Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866839 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerDied","Data":"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"} Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"77fe0cdf-9f81-47f7-b888-adaf749e562a","Type":"ContainerDied","Data":"32e5ac44ff2057b26df37ede2a30e4a5597003fa99703b0343d23c8056d6709e"} Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866863 4794 scope.go:117] "RemoveContainer" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.866984 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.872736 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.898829 4794 scope.go:117] "RemoveContainer" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.915779 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.916416 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-config-data" (OuterVolumeSpecName: "config-data") pod "77fe0cdf-9f81-47f7-b888-adaf749e562a" (UID: "77fe0cdf-9f81-47f7-b888-adaf749e562a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.921888 4794 scope.go:117] "RemoveContainer" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926288 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926314 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926339 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926351 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m67r5\" (UniqueName: \"kubernetes.io/projected/77fe0cdf-9f81-47f7-b888-adaf749e562a-kube-api-access-m67r5\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926359 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77fe0cdf-9f81-47f7-b888-adaf749e562a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926368 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926381 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.926390 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77fe0cdf-9f81-47f7-b888-adaf749e562a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.952973 4794 scope.go:117] "RemoveContainer" containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.980079 4794 scope.go:117] "RemoveContainer" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1" Dec 15 14:17:57 crc kubenswrapper[4794]: E1215 14:17:57.980643 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": container with ID starting with b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1 not found: ID does not exist" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.980737 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"} err="failed to get container status \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": rpc error: code = NotFound desc = could not find container \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": container with ID starting with b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1 not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.980818 4794 scope.go:117] "RemoveContainer" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893" Dec 15 14:17:57 crc 
kubenswrapper[4794]: E1215 14:17:57.981185 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": container with ID starting with fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893 not found: ID does not exist" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.982752 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"} err="failed to get container status \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": rpc error: code = NotFound desc = could not find container \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": container with ID starting with fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893 not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.982825 4794 scope.go:117] "RemoveContainer" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8" Dec 15 14:17:57 crc kubenswrapper[4794]: E1215 14:17:57.983154 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": container with ID starting with 78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8 not found: ID does not exist" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.983210 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"} err="failed to get container status 
\"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": rpc error: code = NotFound desc = could not find container \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": container with ID starting with 78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8 not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.983245 4794 scope.go:117] "RemoveContainer" containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae" Dec 15 14:17:57 crc kubenswrapper[4794]: E1215 14:17:57.983638 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": container with ID starting with 56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae not found: ID does not exist" containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.983670 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"} err="failed to get container status \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": rpc error: code = NotFound desc = could not find container \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": container with ID starting with 56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.983689 4794 scope.go:117] "RemoveContainer" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.984054 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"} err="failed to get 
container status \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": rpc error: code = NotFound desc = could not find container \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": container with ID starting with b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1 not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.984079 4794 scope.go:117] "RemoveContainer" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.984300 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"} err="failed to get container status \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": rpc error: code = NotFound desc = could not find container \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": container with ID starting with fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893 not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.984325 4794 scope.go:117] "RemoveContainer" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.984663 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"} err="failed to get container status \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": rpc error: code = NotFound desc = could not find container \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": container with ID starting with 78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8 not found: ID does not exist" Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.984687 4794 scope.go:117] "RemoveContainer" 
containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985031 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"} err="failed to get container status \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": rpc error: code = NotFound desc = could not find container \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": container with ID starting with 56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985062 4794 scope.go:117] "RemoveContainer" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985339 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"} err="failed to get container status \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": rpc error: code = NotFound desc = could not find container \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": container with ID starting with b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1 not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985359 4794 scope.go:117] "RemoveContainer" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985627 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"} err="failed to get container status \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": rpc error: code = NotFound desc = could not find container \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": container with ID starting with fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893 not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985655 4794 scope.go:117] "RemoveContainer" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985877 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"} err="failed to get container status \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": rpc error: code = NotFound desc = could not find container \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": container with ID starting with 78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8 not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.985902 4794 scope.go:117] "RemoveContainer" containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986071 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"} err="failed to get container status \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": rpc error: code = NotFound desc = could not find container \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": container with ID starting with 56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986095 4794 scope.go:117] "RemoveContainer" containerID="b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986310 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1"} err="failed to get container status \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": rpc error: code = NotFound desc = could not find container \"b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1\": container with ID starting with b7d4624e79389d07c722b100afcd2741eb5876bba8ee7c5de5a4c089257564b1 not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986335 4794 scope.go:117] "RemoveContainer" containerID="fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986556 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893"} err="failed to get container status \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": rpc error: code = NotFound desc = could not find container \"fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893\": container with ID starting with fb9a4db8fcb7697e114a4938eb232c86206e97ffe701f1d13bbf1a72acfb9893 not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986578 4794 scope.go:117] "RemoveContainer" containerID="78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986795 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8"} err="failed to get container status \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": rpc error: code = NotFound desc = could not find container \"78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8\": container with ID starting with 78acbc721425dd4b5561f6daa4e4cd3aff584a3695d90d89ae0c6c99cb81d0c8 not found: ID does not exist"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.986813 4794 scope.go:117] "RemoveContainer" containerID="56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"
Dec 15 14:17:57 crc kubenswrapper[4794]: I1215 14:17:57.987078 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae"} err="failed to get container status \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": rpc error: code = NotFound desc = could not find container \"56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae\": container with ID starting with 56beb266db57032c8f1fa281cb70c1187700b6b8db8456aaec8c88c418007fae not found: ID does not exist"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.200889 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.208394 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228112 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:17:58 crc kubenswrapper[4794]: E1215 14:17:58.228454 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f790587d-a223-47c4-ba0d-79b0c2127f61" containerName="mariadb-database-create"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228473 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f790587d-a223-47c4-ba0d-79b0c2127f61" containerName="mariadb-database-create"
Dec 15 14:17:58 crc kubenswrapper[4794]: E1215 14:17:58.228497 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="proxy-httpd"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228508 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="proxy-httpd"
Dec 15 14:17:58 crc kubenswrapper[4794]: E1215 14:17:58.228524 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="sg-core"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228533 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="sg-core"
Dec 15 14:17:58 crc kubenswrapper[4794]: E1215 14:17:58.228546 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-notification-agent"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228554 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-notification-agent"
Dec 15 14:17:58 crc kubenswrapper[4794]: E1215 14:17:58.228568 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-central-agent"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228577 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-central-agent"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228797 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="sg-core"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228818 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-notification-agent"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228832 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="proxy-httpd"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228847 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" containerName="ceilometer-central-agent"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.228859 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f790587d-a223-47c4-ba0d-79b0c2127f61" containerName="mariadb-database-create"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.230794 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.234191 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.234347 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.234437 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.253461 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331061 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kncf\" (UniqueName: \"kubernetes.io/projected/10c3a22d-52a8-43ff-ad37-238331ecd3e2-kube-api-access-6kncf\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331182 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331248 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-run-httpd\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331280 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331309 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-log-httpd\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331415 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-config-data\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.331609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-scripts\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433143 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433185 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-run-httpd\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433214 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433252 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433288 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-log-httpd\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433316 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-config-data\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433375 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-scripts\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433413 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kncf\" (UniqueName: \"kubernetes.io/projected/10c3a22d-52a8-43ff-ad37-238331ecd3e2-kube-api-access-6kncf\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433719 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-run-httpd\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.433769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-log-httpd\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.437110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.437139 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-scripts\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.437963 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-config-data\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.440203 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.441619 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.458100 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kncf\" (UniqueName: \"kubernetes.io/projected/10c3a22d-52a8-43ff-ad37-238331ecd3e2-kube-api-access-6kncf\") pod \"ceilometer-0\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.548255 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:17:58 crc kubenswrapper[4794]: I1215 14:17:58.750040 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fe0cdf-9f81-47f7-b888-adaf749e562a" path="/var/lib/kubelet/pods/77fe0cdf-9f81-47f7-b888-adaf749e562a/volumes"
Dec 15 14:17:59 crc kubenswrapper[4794]: I1215 14:17:59.008563 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:17:59 crc kubenswrapper[4794]: W1215 14:17:59.016541 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c3a22d_52a8_43ff_ad37_238331ecd3e2.slice/crio-411aadbc9fb831c11ea13a9deea50c321e6830d2ce01477b0cafe817a8336942 WatchSource:0}: Error finding container 411aadbc9fb831c11ea13a9deea50c321e6830d2ce01477b0cafe817a8336942: Status 404 returned error can't find the container with id 411aadbc9fb831c11ea13a9deea50c321e6830d2ce01477b0cafe817a8336942
Dec 15 14:17:59 crc kubenswrapper[4794]: I1215 14:17:59.895943 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerStarted","Data":"eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c"}
Dec 15 14:17:59 crc kubenswrapper[4794]: I1215 14:17:59.896663 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerStarted","Data":"411aadbc9fb831c11ea13a9deea50c321e6830d2ce01477b0cafe817a8336942"}
Dec 15 14:18:01 crc kubenswrapper[4794]: I1215 14:18:01.911739 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerStarted","Data":"e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208"}
Dec 15 14:18:01 crc kubenswrapper[4794]: I1215 14:18:01.912394 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerStarted","Data":"7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5"}
Dec 15 14:18:03 crc kubenswrapper[4794]: I1215 14:18:03.937420 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerStarted","Data":"462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd"}
Dec 15 14:18:03 crc kubenswrapper[4794]: I1215 14:18:03.939887 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:03 crc kubenswrapper[4794]: I1215 14:18:03.977762 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.03467002 podStartE2EDuration="5.977743217s" podCreationTimestamp="2025-12-15 14:17:58 +0000 UTC" firstStartedPulling="2025-12-15 14:17:59.018295307 +0000 UTC m=+1440.870317765" lastFinishedPulling="2025-12-15 14:18:02.961368504 +0000 UTC m=+1444.813390962" observedRunningTime="2025-12-15 14:18:03.973926639 +0000 UTC m=+1445.825949107" watchObservedRunningTime="2025-12-15 14:18:03.977743217 +0000 UTC m=+1445.829765675"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.646823 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-70d5-account-create-5cvz7"]
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.648121 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.651813 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.656473 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-70d5-account-create-5cvz7"]
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.736388 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2xts\" (UniqueName: \"kubernetes.io/projected/8bb5ad5d-c780-4b1d-95c1-784ce8850044-kube-api-access-h2xts\") pod \"watcher-70d5-account-create-5cvz7\" (UID: \"8bb5ad5d-c780-4b1d-95c1-784ce8850044\") " pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.838388 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2xts\" (UniqueName: \"kubernetes.io/projected/8bb5ad5d-c780-4b1d-95c1-784ce8850044-kube-api-access-h2xts\") pod \"watcher-70d5-account-create-5cvz7\" (UID: \"8bb5ad5d-c780-4b1d-95c1-784ce8850044\") " pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.862903 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2xts\" (UniqueName: \"kubernetes.io/projected/8bb5ad5d-c780-4b1d-95c1-784ce8850044-kube-api-access-h2xts\") pod \"watcher-70d5-account-create-5cvz7\" (UID: \"8bb5ad5d-c780-4b1d-95c1-784ce8850044\") " pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.894799 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvmxr"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.963515 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvmxr"
Dec 15 14:18:04 crc kubenswrapper[4794]: I1215 14:18:04.995535 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:05 crc kubenswrapper[4794]: I1215 14:18:05.472264 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-70d5-account-create-5cvz7"]
Dec 15 14:18:05 crc kubenswrapper[4794]: I1215 14:18:05.963249 4794 generic.go:334] "Generic (PLEG): container finished" podID="8bb5ad5d-c780-4b1d-95c1-784ce8850044" containerID="05f8e2e23f7ae367489dc44d588c2ebf5dd149eb3e3b474358ad41e06e966d69" exitCode=0
Dec 15 14:18:05 crc kubenswrapper[4794]: I1215 14:18:05.965274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7" event={"ID":"8bb5ad5d-c780-4b1d-95c1-784ce8850044","Type":"ContainerDied","Data":"05f8e2e23f7ae367489dc44d588c2ebf5dd149eb3e3b474358ad41e06e966d69"}
Dec 15 14:18:05 crc kubenswrapper[4794]: I1215 14:18:05.965334 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7" event={"ID":"8bb5ad5d-c780-4b1d-95c1-784ce8850044","Type":"ContainerStarted","Data":"58ae957ac2febc522f3c13007475b56907c77995a0dc088ea9cc4fa24ce01ed0"}
Dec 15 14:18:07 crc kubenswrapper[4794]: I1215 14:18:07.417902 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:07 crc kubenswrapper[4794]: I1215 14:18:07.483175 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2xts\" (UniqueName: \"kubernetes.io/projected/8bb5ad5d-c780-4b1d-95c1-784ce8850044-kube-api-access-h2xts\") pod \"8bb5ad5d-c780-4b1d-95c1-784ce8850044\" (UID: \"8bb5ad5d-c780-4b1d-95c1-784ce8850044\") "
Dec 15 14:18:07 crc kubenswrapper[4794]: I1215 14:18:07.488675 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb5ad5d-c780-4b1d-95c1-784ce8850044-kube-api-access-h2xts" (OuterVolumeSpecName: "kube-api-access-h2xts") pod "8bb5ad5d-c780-4b1d-95c1-784ce8850044" (UID: "8bb5ad5d-c780-4b1d-95c1-784ce8850044"). InnerVolumeSpecName "kube-api-access-h2xts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:18:07 crc kubenswrapper[4794]: I1215 14:18:07.585864 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2xts\" (UniqueName: \"kubernetes.io/projected/8bb5ad5d-c780-4b1d-95c1-784ce8850044-kube-api-access-h2xts\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:08 crc kubenswrapper[4794]: I1215 14:18:08.020642 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7" event={"ID":"8bb5ad5d-c780-4b1d-95c1-784ce8850044","Type":"ContainerDied","Data":"58ae957ac2febc522f3c13007475b56907c77995a0dc088ea9cc4fa24ce01ed0"}
Dec 15 14:18:08 crc kubenswrapper[4794]: I1215 14:18:08.020698 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58ae957ac2febc522f3c13007475b56907c77995a0dc088ea9cc4fa24ce01ed0"
Dec 15 14:18:08 crc kubenswrapper[4794]: I1215 14:18:08.020777 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-70d5-account-create-5cvz7"
Dec 15 14:18:08 crc kubenswrapper[4794]: I1215 14:18:08.469830 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvmxr"]
Dec 15 14:18:08 crc kubenswrapper[4794]: I1215 14:18:08.470256 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvmxr" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="registry-server" containerID="cri-o://c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b" gracePeriod=2
Dec 15 14:18:08 crc kubenswrapper[4794]: I1215 14:18:08.923533 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvmxr"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.007417 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmfp6\" (UniqueName: \"kubernetes.io/projected/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-kube-api-access-jmfp6\") pod \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") "
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.007534 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-catalog-content\") pod \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") "
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.007652 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-utilities\") pod \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\" (UID: \"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5\") "
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.008943 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-utilities" (OuterVolumeSpecName: "utilities") pod "ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" (UID: "ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.014009 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-kube-api-access-jmfp6" (OuterVolumeSpecName: "kube-api-access-jmfp6") pod "ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" (UID: "ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5"). InnerVolumeSpecName "kube-api-access-jmfp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.033765 4794 generic.go:334] "Generic (PLEG): container finished" podID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerID="c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b" exitCode=0
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.033808 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerDied","Data":"c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b"}
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.033847 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvmxr" event={"ID":"ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5","Type":"ContainerDied","Data":"eb3de9fd5404934f668b4e824b0e0d95c6d0bcf14421c788ff19580388c533fc"}
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.033863 4794 scope.go:117] "RemoveContainer" containerID="c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.034413 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvmxr"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.071532 4794 scope.go:117] "RemoveContainer" containerID="47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.107254 4794 scope.go:117] "RemoveContainer" containerID="bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.108984 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmfp6\" (UniqueName: \"kubernetes.io/projected/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-kube-api-access-jmfp6\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.109002 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-utilities\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.145869 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" (UID: "ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.148724 4794 scope.go:117] "RemoveContainer" containerID="c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b"
Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.149056 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b\": container with ID starting with c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b not found: ID does not exist" containerID="c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.149083 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b"} err="failed to get container status \"c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b\": rpc error: code = NotFound desc = could not find container \"c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b\": container with ID starting with c1cad75b8fee56d3f38b04e4f13bb4c568435b9509d18398d0a97350e2a74f4b not found: ID does not exist"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.149103 4794 scope.go:117] "RemoveContainer" containerID="47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1"
Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.149369 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1\": container with ID starting with 47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1 not found: ID does not exist" containerID="47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1"
Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.149393 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1"} err="failed to get container status \"47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1\": rpc error: code = NotFound desc = could not find container \"47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1\": container with ID starting with 47d12a0f5af0a13dd83f78e58a0e5a9188861ee5a5ae95dae2bfb5fb54667ca1 not found: ID does not exist" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.149408 4794 scope.go:117] "RemoveContainer" containerID="bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43" Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.150704 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43\": container with ID starting with bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43 not found: ID does not exist" containerID="bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.150768 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43"} err="failed to get container status \"bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43\": rpc error: code = NotFound desc = could not find container \"bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43\": container with ID starting with bf2b48d21e0f644c85f5a720ca36e0fe3b0593276e3a66d61ee9eaea92924d43 not found: ID does not exist" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.210282 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.383404 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvmxr"] Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.389353 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvmxr"] Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.993712 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2"] Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.994341 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="extract-content" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.994356 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="extract-content" Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.994375 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="registry-server" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.994384 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="registry-server" Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.994403 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="extract-utilities" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.994412 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="extract-utilities" Dec 15 14:18:09 crc kubenswrapper[4794]: E1215 14:18:09.994429 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb5ad5d-c780-4b1d-95c1-784ce8850044" 
containerName="mariadb-account-create" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.994437 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb5ad5d-c780-4b1d-95c1-784ce8850044" containerName="mariadb-account-create" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.994662 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" containerName="registry-server" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.994686 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb5ad5d-c780-4b1d-95c1-784ce8850044" containerName="mariadb-account-create" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.995326 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.998474 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-k6t5j" Dec 15 14:18:09 crc kubenswrapper[4794]: I1215 14:18:09.999273 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.009619 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2"] Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.026780 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r868\" (UniqueName: \"kubernetes.io/projected/ea336596-4d0f-4ca5-982f-bcefbe49f545-kube-api-access-9r868\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.026912 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.027144 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.027208 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-config-data\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.129283 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.129404 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.129448 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-config-data\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.129481 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r868\" (UniqueName: \"kubernetes.io/projected/ea336596-4d0f-4ca5-982f-bcefbe49f545-kube-api-access-9r868\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.133654 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.134432 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-config-data\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.136051 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.149463 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9r868\" (UniqueName: \"kubernetes.io/projected/ea336596-4d0f-4ca5-982f-bcefbe49f545-kube-api-access-9r868\") pod \"watcher-kuttl-db-sync-vr8j2\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.310945 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.754126 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5" path="/var/lib/kubelet/pods/ab7ed8fb-6c45-4bdc-a14a-9da38f1dddc5/volumes" Dec 15 14:18:10 crc kubenswrapper[4794]: I1215 14:18:10.783723 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2"] Dec 15 14:18:10 crc kubenswrapper[4794]: W1215 14:18:10.799824 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea336596_4d0f_4ca5_982f_bcefbe49f545.slice/crio-b9d826329fe20cd38d62a6abb55466a29bb6604211e4a1f92192736653bc0a1f WatchSource:0}: Error finding container b9d826329fe20cd38d62a6abb55466a29bb6604211e4a1f92192736653bc0a1f: Status 404 returned error can't find the container with id b9d826329fe20cd38d62a6abb55466a29bb6604211e4a1f92192736653bc0a1f Dec 15 14:18:11 crc kubenswrapper[4794]: I1215 14:18:11.050648 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" event={"ID":"ea336596-4d0f-4ca5-982f-bcefbe49f545","Type":"ContainerStarted","Data":"fd515d84d675aac17d39d17a5e36446635ab6395ad11067065d37e2706481757"} Dec 15 14:18:11 crc kubenswrapper[4794]: I1215 14:18:11.050690 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" 
event={"ID":"ea336596-4d0f-4ca5-982f-bcefbe49f545","Type":"ContainerStarted","Data":"b9d826329fe20cd38d62a6abb55466a29bb6604211e4a1f92192736653bc0a1f"} Dec 15 14:18:11 crc kubenswrapper[4794]: I1215 14:18:11.070282 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" podStartSLOduration=2.070264714 podStartE2EDuration="2.070264714s" podCreationTimestamp="2025-12-15 14:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:11.065732566 +0000 UTC m=+1452.917755004" watchObservedRunningTime="2025-12-15 14:18:11.070264714 +0000 UTC m=+1452.922287142" Dec 15 14:18:14 crc kubenswrapper[4794]: I1215 14:18:14.103829 4794 generic.go:334] "Generic (PLEG): container finished" podID="ea336596-4d0f-4ca5-982f-bcefbe49f545" containerID="fd515d84d675aac17d39d17a5e36446635ab6395ad11067065d37e2706481757" exitCode=0 Dec 15 14:18:14 crc kubenswrapper[4794]: I1215 14:18:14.103918 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" event={"ID":"ea336596-4d0f-4ca5-982f-bcefbe49f545","Type":"ContainerDied","Data":"fd515d84d675aac17d39d17a5e36446635ab6395ad11067065d37e2706481757"} Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.439745 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.622652 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-db-sync-config-data\") pod \"ea336596-4d0f-4ca5-982f-bcefbe49f545\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.622760 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-combined-ca-bundle\") pod \"ea336596-4d0f-4ca5-982f-bcefbe49f545\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.622820 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-config-data\") pod \"ea336596-4d0f-4ca5-982f-bcefbe49f545\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.622874 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r868\" (UniqueName: \"kubernetes.io/projected/ea336596-4d0f-4ca5-982f-bcefbe49f545-kube-api-access-9r868\") pod \"ea336596-4d0f-4ca5-982f-bcefbe49f545\" (UID: \"ea336596-4d0f-4ca5-982f-bcefbe49f545\") " Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.628923 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea336596-4d0f-4ca5-982f-bcefbe49f545-kube-api-access-9r868" (OuterVolumeSpecName: "kube-api-access-9r868") pod "ea336596-4d0f-4ca5-982f-bcefbe49f545" (UID: "ea336596-4d0f-4ca5-982f-bcefbe49f545"). InnerVolumeSpecName "kube-api-access-9r868". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.629621 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ea336596-4d0f-4ca5-982f-bcefbe49f545" (UID: "ea336596-4d0f-4ca5-982f-bcefbe49f545"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.666220 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea336596-4d0f-4ca5-982f-bcefbe49f545" (UID: "ea336596-4d0f-4ca5-982f-bcefbe49f545"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.690039 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-config-data" (OuterVolumeSpecName: "config-data") pod "ea336596-4d0f-4ca5-982f-bcefbe49f545" (UID: "ea336596-4d0f-4ca5-982f-bcefbe49f545"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.725048 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.725101 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.725113 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea336596-4d0f-4ca5-982f-bcefbe49f545-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:15 crc kubenswrapper[4794]: I1215 14:18:15.725124 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r868\" (UniqueName: \"kubernetes.io/projected/ea336596-4d0f-4ca5-982f-bcefbe49f545-kube-api-access-9r868\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.125629 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" event={"ID":"ea336596-4d0f-4ca5-982f-bcefbe49f545","Type":"ContainerDied","Data":"b9d826329fe20cd38d62a6abb55466a29bb6604211e4a1f92192736653bc0a1f"} Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.125701 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9d826329fe20cd38d62a6abb55466a29bb6604211e4a1f92192736653bc0a1f" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.125652 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.504618 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:16 crc kubenswrapper[4794]: E1215 14:18:16.504961 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea336596-4d0f-4ca5-982f-bcefbe49f545" containerName="watcher-kuttl-db-sync" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.504975 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea336596-4d0f-4ca5-982f-bcefbe49f545" containerName="watcher-kuttl-db-sync" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.505135 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea336596-4d0f-4ca5-982f-bcefbe49f545" containerName="watcher-kuttl-db-sync" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.505681 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.507565 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-k6t5j" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.507753 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.514170 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.515557 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.517280 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.517393 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.518071 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.525281 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.544318 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.585774 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.587684 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.590295 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.617906 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.643810 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rlld\" (UniqueName: \"kubernetes.io/projected/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-kube-api-access-8rlld\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.643875 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chv2z\" (UniqueName: \"kubernetes.io/projected/1c2768c9-bf11-4346-ab55-5e9a95f65278-kube-api-access-chv2z\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644082 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644244 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-combined-ca-bundle\") pod 
\"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644289 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644387 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644436 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-logs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644476 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644504 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-custom-prometheus-ca\") 
pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644550 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.644808 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c2768c9-bf11-4346-ab55-5e9a95f65278-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.745874 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.745939 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-logs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.745993 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: 
\"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746017 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746046 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564983a5-b527-4907-ad6f-790afad9a87c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746082 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c2768c9-bf11-4346-ab55-5e9a95f65278-logs\") pod \"watcher-kuttl-applier-0\" (UID: 
\"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746203 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rlld\" (UniqueName: \"kubernetes.io/projected/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-kube-api-access-8rlld\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746231 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx897\" (UniqueName: \"kubernetes.io/projected/564983a5-b527-4907-ad6f-790afad9a87c-kube-api-access-sx897\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chv2z\" (UniqueName: \"kubernetes.io/projected/1c2768c9-bf11-4346-ab55-5e9a95f65278-kube-api-access-chv2z\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746298 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746325 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746356 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.746873 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c2768c9-bf11-4346-ab55-5e9a95f65278-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.747188 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.747282 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.747965 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-logs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.750700 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.751913 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.752408 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.753179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.754247 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: 
\"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.756395 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.760405 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.764238 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rlld\" (UniqueName: \"kubernetes.io/projected/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-kube-api-access-8rlld\") pod \"watcher-kuttl-api-0\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.778275 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chv2z\" (UniqueName: \"kubernetes.io/projected/1c2768c9-bf11-4346-ab55-5e9a95f65278-kube-api-access-chv2z\") pod \"watcher-kuttl-applier-0\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.824274 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.837986 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.848668 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564983a5-b527-4907-ad6f-790afad9a87c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.848749 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.848893 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx897\" (UniqueName: \"kubernetes.io/projected/564983a5-b527-4907-ad6f-790afad9a87c-kube-api-access-sx897\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.849003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.849047 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" 
(UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.850792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564983a5-b527-4907-ad6f-790afad9a87c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.856762 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.857510 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.858151 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.876196 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx897\" (UniqueName: \"kubernetes.io/projected/564983a5-b527-4907-ad6f-790afad9a87c-kube-api-access-sx897\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"564983a5-b527-4907-ad6f-790afad9a87c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:16 crc kubenswrapper[4794]: I1215 14:18:16.909483 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:17 crc kubenswrapper[4794]: I1215 14:18:17.155404 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:17 crc kubenswrapper[4794]: I1215 14:18:17.280071 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:17 crc kubenswrapper[4794]: W1215 14:18:17.280693 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c2768c9_bf11_4346_ab55_5e9a95f65278.slice/crio-8b83eee1d05ff8fe7e6801a8c412725889babe4017053dd7bd346fbcae1d9d28 WatchSource:0}: Error finding container 8b83eee1d05ff8fe7e6801a8c412725889babe4017053dd7bd346fbcae1d9d28: Status 404 returned error can't find the container with id 8b83eee1d05ff8fe7e6801a8c412725889babe4017053dd7bd346fbcae1d9d28 Dec 15 14:18:17 crc kubenswrapper[4794]: W1215 14:18:17.419774 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod564983a5_b527_4907_ad6f_790afad9a87c.slice/crio-873ddb46f488e1b4a68245ba73f91ccb243024b22f6262ef5453296a5441a43d WatchSource:0}: Error finding container 873ddb46f488e1b4a68245ba73f91ccb243024b22f6262ef5453296a5441a43d: Status 404 returned error can't find the container with id 873ddb46f488e1b4a68245ba73f91ccb243024b22f6262ef5453296a5441a43d Dec 15 14:18:17 crc kubenswrapper[4794]: I1215 14:18:17.422637 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.144121 4794 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c2768c9-bf11-4346-ab55-5e9a95f65278","Type":"ContainerStarted","Data":"dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.144454 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c2768c9-bf11-4346-ab55-5e9a95f65278","Type":"ContainerStarted","Data":"8b83eee1d05ff8fe7e6801a8c412725889babe4017053dd7bd346fbcae1d9d28"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.146747 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4ef6b62b-a279-4abc-87fc-2a00610bb1aa","Type":"ContainerStarted","Data":"e8dbeb4ccbd4cf96c6f0bb20b4291e3b02ac96f2c66e8244631b2b9a0c59eab6"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.146806 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4ef6b62b-a279-4abc-87fc-2a00610bb1aa","Type":"ContainerStarted","Data":"0a05a95e51359e88fa3a77d764a79df663fbcbb23ee683c895ee14f623114f81"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.146827 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4ef6b62b-a279-4abc-87fc-2a00610bb1aa","Type":"ContainerStarted","Data":"98299c1f622969e94f20c825bb0b54c76a6dd0b2f1120b4f737e81f4dd03b511"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.146958 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.150943 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"564983a5-b527-4907-ad6f-790afad9a87c","Type":"ContainerStarted","Data":"2cea1ba8b2811c243f8be19e131a68993662395f0db2cf5633b334462f695705"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.150973 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"564983a5-b527-4907-ad6f-790afad9a87c","Type":"ContainerStarted","Data":"873ddb46f488e1b4a68245ba73f91ccb243024b22f6262ef5453296a5441a43d"} Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.166664 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.16664125 podStartE2EDuration="2.16664125s" podCreationTimestamp="2025-12-15 14:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:18.16097906 +0000 UTC m=+1460.013001508" watchObservedRunningTime="2025-12-15 14:18:18.16664125 +0000 UTC m=+1460.018663708" Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.191872 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.191852042 podStartE2EDuration="2.191852042s" podCreationTimestamp="2025-12-15 14:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:18.185723629 +0000 UTC m=+1460.037746067" watchObservedRunningTime="2025-12-15 14:18:18.191852042 +0000 UTC m=+1460.043874480" Dec 15 14:18:18 crc kubenswrapper[4794]: I1215 14:18:18.209682 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.209666416 podStartE2EDuration="2.209666416s" podCreationTimestamp="2025-12-15 14:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:18.205763606 +0000 UTC m=+1460.057786044" watchObservedRunningTime="2025-12-15 14:18:18.209666416 +0000 UTC m=+1460.061688854" Dec 15 14:18:20 crc kubenswrapper[4794]: I1215 14:18:20.465344 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:21 crc kubenswrapper[4794]: I1215 14:18:21.825152 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:21 crc kubenswrapper[4794]: I1215 14:18:21.838811 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:26 crc kubenswrapper[4794]: I1215 14:18:26.825510 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:26 crc kubenswrapper[4794]: I1215 14:18:26.839283 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:26 crc kubenswrapper[4794]: I1215 14:18:26.858348 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:26 crc kubenswrapper[4794]: I1215 14:18:26.871453 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:26 crc kubenswrapper[4794]: I1215 14:18:26.909902 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:26 crc kubenswrapper[4794]: I1215 14:18:26.945242 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:27 crc kubenswrapper[4794]: I1215 14:18:27.241844 4794 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:27 crc kubenswrapper[4794]: I1215 14:18:27.262734 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:27 crc kubenswrapper[4794]: I1215 14:18:27.267195 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:27 crc kubenswrapper[4794]: I1215 14:18:27.282477 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:28 crc kubenswrapper[4794]: I1215 14:18:28.554640 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:29 crc kubenswrapper[4794]: I1215 14:18:29.493366 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:29 crc kubenswrapper[4794]: I1215 14:18:29.493740 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-central-agent" containerID="cri-o://eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c" gracePeriod=30 Dec 15 14:18:29 crc kubenswrapper[4794]: I1215 14:18:29.494380 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="proxy-httpd" containerID="cri-o://462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd" gracePeriod=30 Dec 15 14:18:29 crc kubenswrapper[4794]: I1215 14:18:29.494440 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="sg-core" 
containerID="cri-o://e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208" gracePeriod=30 Dec 15 14:18:29 crc kubenswrapper[4794]: I1215 14:18:29.494482 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-notification-agent" containerID="cri-o://7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5" gracePeriod=30 Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.268322 4794 generic.go:334] "Generic (PLEG): container finished" podID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerID="462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd" exitCode=0 Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.268681 4794 generic.go:334] "Generic (PLEG): container finished" podID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerID="e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208" exitCode=2 Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.268694 4794 generic.go:334] "Generic (PLEG): container finished" podID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerID="eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c" exitCode=0 Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.268424 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerDied","Data":"462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd"} Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.268734 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerDied","Data":"e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208"} Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.268752 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerDied","Data":"eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c"} Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.978261 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2"] Dec 15 14:18:30 crc kubenswrapper[4794]: I1215 14:18:30.984036 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vr8j2"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.031241 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.031470 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="564983a5-b527-4907-ad6f-790afad9a87c" containerName="watcher-decision-engine" containerID="cri-o://2cea1ba8b2811c243f8be19e131a68993662395f0db2cf5633b334462f695705" gracePeriod=30 Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.078654 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher70d5-account-delete-chgmw"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.079762 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.092285 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher70d5-account-delete-chgmw"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.121011 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-lr55r"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.133573 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.134113 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1c2768c9-bf11-4346-ab55-5e9a95f65278" containerName="watcher-applier" containerID="cri-o://dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" gracePeriod=30 Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.153786 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-70d5-account-create-5cvz7"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.164896 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-lr55r"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.182691 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher70d5-account-delete-chgmw"] Dec 15 14:18:31 crc kubenswrapper[4794]: E1215 14:18:31.183323 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-c7mkg], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" podUID="5ebf7cc8-b80f-432f-b1ad-275742da85a3" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.197613 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/watcher-70d5-account-create-5cvz7"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.210079 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7mkg\" (UniqueName: \"kubernetes.io/projected/5ebf7cc8-b80f-432f-b1ad-275742da85a3-kube-api-access-c7mkg\") pod \"watcher70d5-account-delete-chgmw\" (UID: \"5ebf7cc8-b80f-432f-b1ad-275742da85a3\") " pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.227992 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.228203 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-kuttl-api-log" containerID="cri-o://0a05a95e51359e88fa3a77d764a79df663fbcbb23ee683c895ee14f623114f81" gracePeriod=30 Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.228596 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-api" containerID="cri-o://e8dbeb4ccbd4cf96c6f0bb20b4291e3b02ac96f2c66e8244631b2b9a0c59eab6" gracePeriod=30 Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.274555 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.284312 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.312249 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7mkg\" (UniqueName: \"kubernetes.io/projected/5ebf7cc8-b80f-432f-b1ad-275742da85a3-kube-api-access-c7mkg\") pod \"watcher70d5-account-delete-chgmw\" (UID: \"5ebf7cc8-b80f-432f-b1ad-275742da85a3\") " pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.341392 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7mkg\" (UniqueName: \"kubernetes.io/projected/5ebf7cc8-b80f-432f-b1ad-275742da85a3-kube-api-access-c7mkg\") pod \"watcher70d5-account-delete-chgmw\" (UID: \"5ebf7cc8-b80f-432f-b1ad-275742da85a3\") " pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.514843 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7mkg\" (UniqueName: \"kubernetes.io/projected/5ebf7cc8-b80f-432f-b1ad-275742da85a3-kube-api-access-c7mkg\") pod \"5ebf7cc8-b80f-432f-b1ad-275742da85a3\" (UID: \"5ebf7cc8-b80f-432f-b1ad-275742da85a3\") " Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.528822 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebf7cc8-b80f-432f-b1ad-275742da85a3-kube-api-access-c7mkg" (OuterVolumeSpecName: "kube-api-access-c7mkg") pod "5ebf7cc8-b80f-432f-b1ad-275742da85a3" (UID: "5ebf7cc8-b80f-432f-b1ad-275742da85a3"). InnerVolumeSpecName "kube-api-access-c7mkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:31 crc kubenswrapper[4794]: I1215 14:18:31.616852 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7mkg\" (UniqueName: \"kubernetes.io/projected/5ebf7cc8-b80f-432f-b1ad-275742da85a3-kube-api-access-c7mkg\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:31 crc kubenswrapper[4794]: E1215 14:18:31.845989 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:18:31 crc kubenswrapper[4794]: E1215 14:18:31.848228 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:18:31 crc kubenswrapper[4794]: E1215 14:18:31.849633 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:18:31 crc kubenswrapper[4794]: E1215 14:18:31.849662 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="1c2768c9-bf11-4346-ab55-5e9a95f65278" containerName="watcher-applier" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.048967 4794 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.128925 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.149:9322/\": read tcp 10.217.0.2:58168->10.217.0.149:9322: read: connection reset by peer" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.129190 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.149:9322/\": read tcp 10.217.0.2:58174->10.217.0.149:9322: read: connection reset by peer" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.233918 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-run-httpd\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234261 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-ceilometer-tls-certs\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234338 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-log-httpd\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234367 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-config-data\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234417 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-sg-core-conf-yaml\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234444 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kncf\" (UniqueName: \"kubernetes.io/projected/10c3a22d-52a8-43ff-ad37-238331ecd3e2-kube-api-access-6kncf\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234497 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-combined-ca-bundle\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234528 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-scripts\") pod \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\" (UID: \"10c3a22d-52a8-43ff-ad37-238331ecd3e2\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.234738 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: 
"10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.235163 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.235448 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.239732 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-scripts" (OuterVolumeSpecName: "scripts") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.241839 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c3a22d-52a8-43ff-ad37-238331ecd3e2-kube-api-access-6kncf" (OuterVolumeSpecName: "kube-api-access-6kncf") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "kube-api-access-6kncf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.288308 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.308785 4794 generic.go:334] "Generic (PLEG): container finished" podID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerID="7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5" exitCode=0 Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.308968 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.309342 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerDied","Data":"7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5"} Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.309368 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.309413 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"10c3a22d-52a8-43ff-ad37-238331ecd3e2","Type":"ContainerDied","Data":"411aadbc9fb831c11ea13a9deea50c321e6830d2ce01477b0cafe817a8336942"} Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.309440 4794 scope.go:117] "RemoveContainer" containerID="462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.314979 4794 generic.go:334] "Generic (PLEG): container finished" podID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerID="e8dbeb4ccbd4cf96c6f0bb20b4291e3b02ac96f2c66e8244631b2b9a0c59eab6" exitCode=0 Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.315006 4794 generic.go:334] "Generic (PLEG): container finished" podID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerID="0a05a95e51359e88fa3a77d764a79df663fbcbb23ee683c895ee14f623114f81" exitCode=143 Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.315015 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4ef6b62b-a279-4abc-87fc-2a00610bb1aa","Type":"ContainerDied","Data":"e8dbeb4ccbd4cf96c6f0bb20b4291e3b02ac96f2c66e8244631b2b9a0c59eab6"} Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.315063 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher70d5-account-delete-chgmw" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.315067 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4ef6b62b-a279-4abc-87fc-2a00610bb1aa","Type":"ContainerDied","Data":"0a05a95e51359e88fa3a77d764a79df663fbcbb23ee683c895ee14f623114f81"} Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.358075 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.358108 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10c3a22d-52a8-43ff-ad37-238331ecd3e2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.358379 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.358475 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kncf\" (UniqueName: \"kubernetes.io/projected/10c3a22d-52a8-43ff-ad37-238331ecd3e2-kube-api-access-6kncf\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.358493 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.367253 4794 scope.go:117] "RemoveContainer" containerID="e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.380734 4794 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-config-data" (OuterVolumeSpecName: "config-data") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.396199 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c3a22d-52a8-43ff-ad37-238331ecd3e2" (UID: "10c3a22d-52a8-43ff-ad37-238331ecd3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.459908 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.459941 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3a22d-52a8-43ff-ad37-238331ecd3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.484192 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.485185 4794 scope.go:117] "RemoveContainer" containerID="7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.512852 4794 scope.go:117] "RemoveContainer" containerID="eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.523808 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher70d5-account-delete-chgmw"] Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.536853 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher70d5-account-delete-chgmw"] Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.538791 4794 scope.go:117] "RemoveContainer" containerID="462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.539217 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd\": container with ID starting with 462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd not found: ID does not exist" containerID="462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.539246 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd"} err="failed to get container status \"462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd\": rpc error: code = NotFound desc = could not find container \"462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd\": container with ID starting with 462a852ab431c0f794482966d920aaf36bb00bd1ad38efa92a954f019cf37ffd 
not found: ID does not exist" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.539279 4794 scope.go:117] "RemoveContainer" containerID="e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.539544 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208\": container with ID starting with e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208 not found: ID does not exist" containerID="e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.539563 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208"} err="failed to get container status \"e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208\": rpc error: code = NotFound desc = could not find container \"e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208\": container with ID starting with e693447426de6e70ba00d743cc29fe199dd650a8ed8bb1c55b29fde2621b0208 not found: ID does not exist" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.539575 4794 scope.go:117] "RemoveContainer" containerID="7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.539838 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5\": container with ID starting with 7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5 not found: ID does not exist" containerID="7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.539859 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5"} err="failed to get container status \"7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5\": rpc error: code = NotFound desc = could not find container \"7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5\": container with ID starting with 7f98cfed899b0174b340b9bfac4fbd57e0ea31b1617ed7b1720399f9ff3c2ec5 not found: ID does not exist" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.539875 4794 scope.go:117] "RemoveContainer" containerID="eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.540033 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c\": container with ID starting with eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c not found: ID does not exist" containerID="eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.540052 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c"} err="failed to get container status \"eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c\": rpc error: code = NotFound desc = could not find container \"eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c\": container with ID starting with eb047f6d2bdf639ffdbbaef1bb376124c0eabd6bb8923ceed6173ac041576f0c not found: ID does not exist" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.642290 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.649879 4794 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.662305 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-combined-ca-bundle\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.662420 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-config-data\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.662524 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-public-tls-certs\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.662544 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-logs\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.662562 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-internal-tls-certs\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.663016 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-custom-prometheus-ca\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.663043 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rlld\" (UniqueName: \"kubernetes.io/projected/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-kube-api-access-8rlld\") pod \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\" (UID: \"4ef6b62b-a279-4abc-87fc-2a00610bb1aa\") " Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.663212 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-logs" (OuterVolumeSpecName: "logs") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.663397 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.664667 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.665077 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-api" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665099 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-api" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.665122 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" 
containerName="ceilometer-notification-agent" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665132 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-notification-agent" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.665145 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="sg-core" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665153 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="sg-core" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.665166 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-kuttl-api-log" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665176 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-kuttl-api-log" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.665195 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-central-agent" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665204 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-central-agent" Dec 15 14:18:32 crc kubenswrapper[4794]: E1215 14:18:32.665221 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="proxy-httpd" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665229 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="proxy-httpd" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665654 4794 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-notification-agent" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665702 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-api" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665724 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="sg-core" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665740 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="ceilometer-central-agent" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665796 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" containerName="proxy-httpd" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.665812 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" containerName="watcher-kuttl-api-log" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.669408 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.676539 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.677222 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.679121 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-kube-api-access-8rlld" (OuterVolumeSpecName: "kube-api-access-8rlld") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "kube-api-access-8rlld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.682500 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.694659 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.695370 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.706770 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.720658 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.737077 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.743043 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-config-data" (OuterVolumeSpecName: "config-data") pod "4ef6b62b-a279-4abc-87fc-2a00610bb1aa" (UID: "4ef6b62b-a279-4abc-87fc-2a00610bb1aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.748511 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c3a22d-52a8-43ff-ad37-238331ecd3e2" path="/var/lib/kubelet/pods/10c3a22d-52a8-43ff-ad37-238331ecd3e2/volumes" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.749247 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebf7cc8-b80f-432f-b1ad-275742da85a3" path="/var/lib/kubelet/pods/5ebf7cc8-b80f-432f-b1ad-275742da85a3/volumes" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.749539 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb5ad5d-c780-4b1d-95c1-784ce8850044" path="/var/lib/kubelet/pods/8bb5ad5d-c780-4b1d-95c1-784ce8850044/volumes" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.750418 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea336596-4d0f-4ca5-982f-bcefbe49f545" path="/var/lib/kubelet/pods/ea336596-4d0f-4ca5-982f-bcefbe49f545/volumes" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.750897 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f790587d-a223-47c4-ba0d-79b0c2127f61" path="/var/lib/kubelet/pods/f790587d-a223-47c4-ba0d-79b0c2127f61/volumes" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764305 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-config-data\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764343 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764363 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-scripts\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764398 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764470 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-log-httpd\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764544 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-run-httpd\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764572 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmjng\" (UniqueName: \"kubernetes.io/projected/af2738b9-93b6-4db4-8fb3-47edbf0dd324-kube-api-access-mmjng\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764732 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764854 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764871 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rlld\" (UniqueName: \"kubernetes.io/projected/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-kube-api-access-8rlld\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764882 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764891 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764901 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.764910 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4ef6b62b-a279-4abc-87fc-2a00610bb1aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.865925 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.866671 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-config-data\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.866716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.866744 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-scripts\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.866840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 
14:18:32.866947 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-log-httpd\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.868700 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-run-httpd\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.867350 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-log-httpd\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.868980 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-run-httpd\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.869538 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.868740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmjng\" (UniqueName: 
\"kubernetes.io/projected/af2738b9-93b6-4db4-8fb3-47edbf0dd324-kube-api-access-mmjng\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.870291 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-scripts\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.871090 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.872113 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.872785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-config-data\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:32 crc kubenswrapper[4794]: I1215 14:18:32.888311 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmjng\" (UniqueName: \"kubernetes.io/projected/af2738b9-93b6-4db4-8fb3-47edbf0dd324-kube-api-access-mmjng\") pod \"ceilometer-0\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.019053 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.376716 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4ef6b62b-a279-4abc-87fc-2a00610bb1aa","Type":"ContainerDied","Data":"98299c1f622969e94f20c825bb0b54c76a6dd0b2f1120b4f737e81f4dd03b511"} Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.377234 4794 scope.go:117] "RemoveContainer" containerID="e8dbeb4ccbd4cf96c6f0bb20b4291e3b02ac96f2c66e8244631b2b9a0c59eab6" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.377090 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.416645 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.426504 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.442974 4794 scope.go:117] "RemoveContainer" containerID="0a05a95e51359e88fa3a77d764a79df663fbcbb23ee683c895ee14f623114f81" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.488798 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:33 crc kubenswrapper[4794]: W1215 14:18:33.526818 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf2738b9_93b6_4db4_8fb3_47edbf0dd324.slice/crio-03ed63a44680e4d880074c7434d4e9822fb3006a0fccdea940325ecba2812697 WatchSource:0}: Error finding container 03ed63a44680e4d880074c7434d4e9822fb3006a0fccdea940325ecba2812697: 
Status 404 returned error can't find the container with id 03ed63a44680e4d880074c7434d4e9822fb3006a0fccdea940325ecba2812697 Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.565134 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.915233 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.989049 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-combined-ca-bundle\") pod \"1c2768c9-bf11-4346-ab55-5e9a95f65278\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.989159 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c2768c9-bf11-4346-ab55-5e9a95f65278-logs\") pod \"1c2768c9-bf11-4346-ab55-5e9a95f65278\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.989207 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-config-data\") pod \"1c2768c9-bf11-4346-ab55-5e9a95f65278\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.989306 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chv2z\" (UniqueName: \"kubernetes.io/projected/1c2768c9-bf11-4346-ab55-5e9a95f65278-kube-api-access-chv2z\") pod \"1c2768c9-bf11-4346-ab55-5e9a95f65278\" (UID: \"1c2768c9-bf11-4346-ab55-5e9a95f65278\") " Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.990047 4794 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c2768c9-bf11-4346-ab55-5e9a95f65278-logs" (OuterVolumeSpecName: "logs") pod "1c2768c9-bf11-4346-ab55-5e9a95f65278" (UID: "1c2768c9-bf11-4346-ab55-5e9a95f65278"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:33 crc kubenswrapper[4794]: I1215 14:18:33.997696 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2768c9-bf11-4346-ab55-5e9a95f65278-kube-api-access-chv2z" (OuterVolumeSpecName: "kube-api-access-chv2z") pod "1c2768c9-bf11-4346-ab55-5e9a95f65278" (UID: "1c2768c9-bf11-4346-ab55-5e9a95f65278"). InnerVolumeSpecName "kube-api-access-chv2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.015686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c2768c9-bf11-4346-ab55-5e9a95f65278" (UID: "1c2768c9-bf11-4346-ab55-5e9a95f65278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.067039 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-config-data" (OuterVolumeSpecName: "config-data") pod "1c2768c9-bf11-4346-ab55-5e9a95f65278" (UID: "1c2768c9-bf11-4346-ab55-5e9a95f65278"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.091337 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.091385 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c2768c9-bf11-4346-ab55-5e9a95f65278-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.091399 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2768c9-bf11-4346-ab55-5e9a95f65278-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.091411 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chv2z\" (UniqueName: \"kubernetes.io/projected/1c2768c9-bf11-4346-ab55-5e9a95f65278-kube-api-access-chv2z\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.402784 4794 generic.go:334] "Generic (PLEG): container finished" podID="1c2768c9-bf11-4346-ab55-5e9a95f65278" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" exitCode=0 Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.402858 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.402876 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c2768c9-bf11-4346-ab55-5e9a95f65278","Type":"ContainerDied","Data":"dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4"} Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.403968 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"1c2768c9-bf11-4346-ab55-5e9a95f65278","Type":"ContainerDied","Data":"8b83eee1d05ff8fe7e6801a8c412725889babe4017053dd7bd346fbcae1d9d28"} Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.403998 4794 scope.go:117] "RemoveContainer" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.414100 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerStarted","Data":"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8"} Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.414152 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerStarted","Data":"03ed63a44680e4d880074c7434d4e9822fb3006a0fccdea940325ecba2812697"} Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.421016 4794 generic.go:334] "Generic (PLEG): container finished" podID="564983a5-b527-4907-ad6f-790afad9a87c" containerID="2cea1ba8b2811c243f8be19e131a68993662395f0db2cf5633b334462f695705" exitCode=0 Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.421058 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"564983a5-b527-4907-ad6f-790afad9a87c","Type":"ContainerDied","Data":"2cea1ba8b2811c243f8be19e131a68993662395f0db2cf5633b334462f695705"} Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.433865 4794 scope.go:117] "RemoveContainer" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" Dec 15 14:18:34 crc kubenswrapper[4794]: E1215 14:18:34.436688 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4\": container with ID starting with dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4 not found: ID does not exist" containerID="dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.436729 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4"} err="failed to get container status \"dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4\": rpc error: code = NotFound desc = could not find container \"dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4\": container with ID starting with dc657f7bc2cfbb1ad4d07623761b49ac8ae943e5ccb5f4cc05f65dbcc1191ff4 not found: ID does not exist" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.452545 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.460552 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.632786 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709146 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-custom-prometheus-ca\") pod \"564983a5-b527-4907-ad6f-790afad9a87c\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709250 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564983a5-b527-4907-ad6f-790afad9a87c-logs\") pod \"564983a5-b527-4907-ad6f-790afad9a87c\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709288 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-combined-ca-bundle\") pod \"564983a5-b527-4907-ad6f-790afad9a87c\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709339 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-config-data\") pod \"564983a5-b527-4907-ad6f-790afad9a87c\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709450 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx897\" (UniqueName: \"kubernetes.io/projected/564983a5-b527-4907-ad6f-790afad9a87c-kube-api-access-sx897\") pod \"564983a5-b527-4907-ad6f-790afad9a87c\" (UID: \"564983a5-b527-4907-ad6f-790afad9a87c\") " Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709517 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/564983a5-b527-4907-ad6f-790afad9a87c-logs" (OuterVolumeSpecName: "logs") pod "564983a5-b527-4907-ad6f-790afad9a87c" (UID: "564983a5-b527-4907-ad6f-790afad9a87c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.709745 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/564983a5-b527-4907-ad6f-790afad9a87c-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.727807 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564983a5-b527-4907-ad6f-790afad9a87c-kube-api-access-sx897" (OuterVolumeSpecName: "kube-api-access-sx897") pod "564983a5-b527-4907-ad6f-790afad9a87c" (UID: "564983a5-b527-4907-ad6f-790afad9a87c"). InnerVolumeSpecName "kube-api-access-sx897". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.750415 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "564983a5-b527-4907-ad6f-790afad9a87c" (UID: "564983a5-b527-4907-ad6f-790afad9a87c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.750971 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2768c9-bf11-4346-ab55-5e9a95f65278" path="/var/lib/kubelet/pods/1c2768c9-bf11-4346-ab55-5e9a95f65278/volumes" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.751511 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef6b62b-a279-4abc-87fc-2a00610bb1aa" path="/var/lib/kubelet/pods/4ef6b62b-a279-4abc-87fc-2a00610bb1aa/volumes" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.768809 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-config-data" (OuterVolumeSpecName: "config-data") pod "564983a5-b527-4907-ad6f-790afad9a87c" (UID: "564983a5-b527-4907-ad6f-790afad9a87c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.800045 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "564983a5-b527-4907-ad6f-790afad9a87c" (UID: "564983a5-b527-4907-ad6f-790afad9a87c"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.812138 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.812173 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx897\" (UniqueName: \"kubernetes.io/projected/564983a5-b527-4907-ad6f-790afad9a87c-kube-api-access-sx897\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.812185 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:34 crc kubenswrapper[4794]: I1215 14:18:34.812197 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564983a5-b527-4907-ad6f-790afad9a87c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:35 crc kubenswrapper[4794]: I1215 14:18:35.440246 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerStarted","Data":"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904"} Dec 15 14:18:35 crc kubenswrapper[4794]: I1215 14:18:35.442541 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"564983a5-b527-4907-ad6f-790afad9a87c","Type":"ContainerDied","Data":"873ddb46f488e1b4a68245ba73f91ccb243024b22f6262ef5453296a5441a43d"} Dec 15 14:18:35 crc kubenswrapper[4794]: I1215 14:18:35.442681 4794 scope.go:117] "RemoveContainer" containerID="2cea1ba8b2811c243f8be19e131a68993662395f0db2cf5633b334462f695705" Dec 15 14:18:35 crc 
kubenswrapper[4794]: I1215 14:18:35.442815 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:35 crc kubenswrapper[4794]: I1215 14:18:35.554072 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:35 crc kubenswrapper[4794]: I1215 14:18:35.561352 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.030417 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-jxgjt"] Dec 15 14:18:36 crc kubenswrapper[4794]: E1215 14:18:36.043470 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2768c9-bf11-4346-ab55-5e9a95f65278" containerName="watcher-applier" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.043513 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2768c9-bf11-4346-ab55-5e9a95f65278" containerName="watcher-applier" Dec 15 14:18:36 crc kubenswrapper[4794]: E1215 14:18:36.043552 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564983a5-b527-4907-ad6f-790afad9a87c" containerName="watcher-decision-engine" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.043561 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="564983a5-b527-4907-ad6f-790afad9a87c" containerName="watcher-decision-engine" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.043775 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2768c9-bf11-4346-ab55-5e9a95f65278" containerName="watcher-applier" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.043844 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="564983a5-b527-4907-ad6f-790afad9a87c" containerName="watcher-decision-engine" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.044464 
4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-jxgjt"] Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.045156 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-jxgjt" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.136178 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smt6v\" (UniqueName: \"kubernetes.io/projected/7d268b75-5ae2-46f1-9695-218590c87682-kube-api-access-smt6v\") pod \"watcher-db-create-jxgjt\" (UID: \"7d268b75-5ae2-46f1-9695-218590c87682\") " pod="watcher-kuttl-default/watcher-db-create-jxgjt" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.237690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smt6v\" (UniqueName: \"kubernetes.io/projected/7d268b75-5ae2-46f1-9695-218590c87682-kube-api-access-smt6v\") pod \"watcher-db-create-jxgjt\" (UID: \"7d268b75-5ae2-46f1-9695-218590c87682\") " pod="watcher-kuttl-default/watcher-db-create-jxgjt" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.261480 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smt6v\" (UniqueName: \"kubernetes.io/projected/7d268b75-5ae2-46f1-9695-218590c87682-kube-api-access-smt6v\") pod \"watcher-db-create-jxgjt\" (UID: \"7d268b75-5ae2-46f1-9695-218590c87682\") " pod="watcher-kuttl-default/watcher-db-create-jxgjt" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.391684 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-jxgjt" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.494137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerStarted","Data":"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735"} Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.746608 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564983a5-b527-4907-ad6f-790afad9a87c" path="/var/lib/kubelet/pods/564983a5-b527-4907-ad6f-790afad9a87c/volumes" Dec 15 14:18:36 crc kubenswrapper[4794]: I1215 14:18:36.880930 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-jxgjt"] Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.513724 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerStarted","Data":"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb"} Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.515019 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-central-agent" containerID="cri-o://be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" gracePeriod=30 Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.515497 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.515888 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="proxy-httpd" 
containerID="cri-o://9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" gracePeriod=30 Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.516038 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="sg-core" containerID="cri-o://f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" gracePeriod=30 Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.516191 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-notification-agent" containerID="cri-o://43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" gracePeriod=30 Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.520777 4794 generic.go:334] "Generic (PLEG): container finished" podID="7d268b75-5ae2-46f1-9695-218590c87682" containerID="b44791a5d9cbe4958c70ca8d1b8848ee4c6900899a1fe0369fc383a03fbd3ef9" exitCode=0 Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.520848 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-jxgjt" event={"ID":"7d268b75-5ae2-46f1-9695-218590c87682","Type":"ContainerDied","Data":"b44791a5d9cbe4958c70ca8d1b8848ee4c6900899a1fe0369fc383a03fbd3ef9"} Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.520877 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-jxgjt" event={"ID":"7d268b75-5ae2-46f1-9695-218590c87682","Type":"ContainerStarted","Data":"e5af6edb7e1ddccd461e9ee86526025c07460c135ce7f93921aae140d557795e"} Dec 15 14:18:37 crc kubenswrapper[4794]: I1215 14:18:37.565297 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.141374821 podStartE2EDuration="5.56527708s" podCreationTimestamp="2025-12-15 
14:18:32 +0000 UTC" firstStartedPulling="2025-12-15 14:18:33.528410701 +0000 UTC m=+1475.380433139" lastFinishedPulling="2025-12-15 14:18:36.95231296 +0000 UTC m=+1478.804335398" observedRunningTime="2025-12-15 14:18:37.56139186 +0000 UTC m=+1479.413414308" watchObservedRunningTime="2025-12-15 14:18:37.56527708 +0000 UTC m=+1479.417299528" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.286155 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.475429 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-log-httpd\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.475769 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-ceilometer-tls-certs\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.475926 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmjng\" (UniqueName: \"kubernetes.io/projected/af2738b9-93b6-4db4-8fb3-47edbf0dd324-kube-api-access-mmjng\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476040 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-scripts\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 
14:18:38.475962 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-sg-core-conf-yaml\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476277 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-config-data\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476645 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-run-httpd\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476652 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476696 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-combined-ca-bundle\") pod \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\" (UID: \"af2738b9-93b6-4db4-8fb3-47edbf0dd324\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.476992 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.477013 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af2738b9-93b6-4db4-8fb3-47edbf0dd324-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.481042 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2738b9-93b6-4db4-8fb3-47edbf0dd324-kube-api-access-mmjng" (OuterVolumeSpecName: "kube-api-access-mmjng") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "kube-api-access-mmjng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.483754 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-scripts" (OuterVolumeSpecName: "scripts") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.509412 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.525465 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.540261 4794 generic.go:334] "Generic (PLEG): container finished" podID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" exitCode=0 Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.540507 4794 generic.go:334] "Generic (PLEG): container finished" podID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" exitCode=2 Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.540568 4794 generic.go:334] "Generic (PLEG): container finished" podID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" exitCode=0 Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.540686 4794 generic.go:334] "Generic (PLEG): container finished" podID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" exitCode=0 Dec 15 14:18:38 crc 
kubenswrapper[4794]: I1215 14:18:38.540474 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerDied","Data":"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb"} Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.541045 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerDied","Data":"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735"} Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.541109 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerDied","Data":"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904"} Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.541174 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerDied","Data":"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8"} Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.541231 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"af2738b9-93b6-4db4-8fb3-47edbf0dd324","Type":"ContainerDied","Data":"03ed63a44680e4d880074c7434d4e9822fb3006a0fccdea940325ecba2812697"} Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.541295 4794 scope.go:117] "RemoveContainer" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.540594 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.551724 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.578486 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.578527 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.578540 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.578551 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.578563 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmjng\" (UniqueName: \"kubernetes.io/projected/af2738b9-93b6-4db4-8fb3-47edbf0dd324-kube-api-access-mmjng\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.584816 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-config-data" (OuterVolumeSpecName: "config-data") pod "af2738b9-93b6-4db4-8fb3-47edbf0dd324" (UID: "af2738b9-93b6-4db4-8fb3-47edbf0dd324"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.662835 4794 scope.go:117] "RemoveContainer" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.679614 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2738b9-93b6-4db4-8fb3-47edbf0dd324-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.681794 4794 scope.go:117] "RemoveContainer" containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.700431 4794 scope.go:117] "RemoveContainer" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.719734 4794 scope.go:117] "RemoveContainer" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.723110 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": container with ID starting with 9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb not found: ID does not exist" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.723163 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb"} err="failed to get container 
status \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": rpc error: code = NotFound desc = could not find container \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": container with ID starting with 9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.723203 4794 scope.go:117] "RemoveContainer" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.723733 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": container with ID starting with f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735 not found: ID does not exist" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.723767 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735"} err="failed to get container status \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": rpc error: code = NotFound desc = could not find container \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": container with ID starting with f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.723792 4794 scope.go:117] "RemoveContainer" containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.724337 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": container with ID starting with 43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904 not found: ID does not exist" containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.724375 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904"} err="failed to get container status \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": rpc error: code = NotFound desc = could not find container \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": container with ID starting with 43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.724397 4794 scope.go:117] "RemoveContainer" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.724676 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": container with ID starting with be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8 not found: ID does not exist" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.724703 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8"} err="failed to get container status \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": rpc error: code = NotFound desc = could not find container \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": container with ID 
starting with be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.724721 4794 scope.go:117] "RemoveContainer" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.725160 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb"} err="failed to get container status \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": rpc error: code = NotFound desc = could not find container \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": container with ID starting with 9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.725278 4794 scope.go:117] "RemoveContainer" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.726002 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735"} err="failed to get container status \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": rpc error: code = NotFound desc = could not find container \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": container with ID starting with f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.726189 4794 scope.go:117] "RemoveContainer" containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.726668 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904"} err="failed to get container status \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": rpc error: code = NotFound desc = could not find container \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": container with ID starting with 43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.726714 4794 scope.go:117] "RemoveContainer" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.727081 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8"} err="failed to get container status \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": rpc error: code = NotFound desc = could not find container \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": container with ID starting with be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.727179 4794 scope.go:117] "RemoveContainer" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.727575 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb"} err="failed to get container status \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": rpc error: code = NotFound desc = could not find container \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": container with ID starting with 9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb not found: ID does not 
exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.727612 4794 scope.go:117] "RemoveContainer" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.728547 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735"} err="failed to get container status \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": rpc error: code = NotFound desc = could not find container \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": container with ID starting with f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.728662 4794 scope.go:117] "RemoveContainer" containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.728978 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904"} err="failed to get container status \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": rpc error: code = NotFound desc = could not find container \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": container with ID starting with 43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.729000 4794 scope.go:117] "RemoveContainer" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.729341 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8"} err="failed to get container status 
\"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": rpc error: code = NotFound desc = could not find container \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": container with ID starting with be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.729377 4794 scope.go:117] "RemoveContainer" containerID="9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.729669 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb"} err="failed to get container status \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": rpc error: code = NotFound desc = could not find container \"9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb\": container with ID starting with 9aa6c29d71737a09142c6db69fd3761bad15c9681878dd1d3a9e4a55c27f8adb not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.729772 4794 scope.go:117] "RemoveContainer" containerID="f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.730092 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735"} err="failed to get container status \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": rpc error: code = NotFound desc = could not find container \"f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735\": container with ID starting with f0632d7f248edc353bb24f8b168e4dfb2c54a1a6453df448ff0b12ff6c572735 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.730120 4794 scope.go:117] "RemoveContainer" 
containerID="43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.730323 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904"} err="failed to get container status \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": rpc error: code = NotFound desc = could not find container \"43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904\": container with ID starting with 43bd07fe0340ed09b0b12d84f7954d664fe6a27ae4fe6b01e1ca116e87053904 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.730342 4794 scope.go:117] "RemoveContainer" containerID="be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.730681 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8"} err="failed to get container status \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": rpc error: code = NotFound desc = could not find container \"be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8\": container with ID starting with be5b61b9cab98af1b7e2fb5789d282e4373a3ee97b71d73a912a9eda5cd550d8 not found: ID does not exist" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.835427 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-jxgjt" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.872875 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.880675 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.884174 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smt6v\" (UniqueName: \"kubernetes.io/projected/7d268b75-5ae2-46f1-9695-218590c87682-kube-api-access-smt6v\") pod \"7d268b75-5ae2-46f1-9695-218590c87682\" (UID: \"7d268b75-5ae2-46f1-9695-218590c87682\") " Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.892735 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d268b75-5ae2-46f1-9695-218590c87682-kube-api-access-smt6v" (OuterVolumeSpecName: "kube-api-access-smt6v") pod "7d268b75-5ae2-46f1-9695-218590c87682" (UID: "7d268b75-5ae2-46f1-9695-218590c87682"). InnerVolumeSpecName "kube-api-access-smt6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.905477 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.905937 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-notification-agent" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.905951 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-notification-agent" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.905966 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="sg-core" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.905977 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="sg-core" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.905986 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-central-agent" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.905992 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-central-agent" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.905998 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d268b75-5ae2-46f1-9695-218590c87682" containerName="mariadb-database-create" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906005 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d268b75-5ae2-46f1-9695-218590c87682" containerName="mariadb-database-create" Dec 15 14:18:38 crc kubenswrapper[4794]: E1215 14:18:38.906017 4794 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="proxy-httpd" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906023 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="proxy-httpd" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906174 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="proxy-httpd" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906184 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d268b75-5ae2-46f1-9695-218590c87682" containerName="mariadb-database-create" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906219 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-central-agent" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906233 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="sg-core" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.906248 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" containerName="ceilometer-notification-agent" Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.908934 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.912919 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.913391 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.914445 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.914539 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986041 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-run-httpd\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986086 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-scripts\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986126 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzdjz\" (UniqueName: \"kubernetes.io/projected/d1adb70b-8c07-4780-aab2-e793f3f85f63-kube-api-access-wzdjz\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-config-data\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986283 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986359 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-log-httpd\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986396 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986419 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:38 crc kubenswrapper[4794]: I1215 14:18:38.986490 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smt6v\" (UniqueName: \"kubernetes.io/projected/7d268b75-5ae2-46f1-9695-218590c87682-kube-api-access-smt6v\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087406 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087496 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-run-httpd\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087523 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-scripts\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087603 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzdjz\" (UniqueName: \"kubernetes.io/projected/d1adb70b-8c07-4780-aab2-e793f3f85f63-kube-api-access-wzdjz\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-config-data\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087697 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087729 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-log-httpd\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.087768 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.089634 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-run-httpd\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.089684 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-log-httpd\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.091904 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.092900 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-config-data\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.092948 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.093522 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.098480 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-scripts\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.109172 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzdjz\" (UniqueName: \"kubernetes.io/projected/d1adb70b-8c07-4780-aab2-e793f3f85f63-kube-api-access-wzdjz\") pod \"ceilometer-0\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.224533 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.569239 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-jxgjt" event={"ID":"7d268b75-5ae2-46f1-9695-218590c87682","Type":"ContainerDied","Data":"e5af6edb7e1ddccd461e9ee86526025c07460c135ce7f93921aae140d557795e"}
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.569538 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5af6edb7e1ddccd461e9ee86526025c07460c135ce7f93921aae140d557795e"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.569332 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-jxgjt"
Dec 15 14:18:39 crc kubenswrapper[4794]: I1215 14:18:39.722362 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"]
Dec 15 14:18:40 crc kubenswrapper[4794]: I1215 14:18:40.581830 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerStarted","Data":"52581c3642cbc3f3c52904462191a10d4887efb1ee2ad86c6b08982b07fe77f3"}
Dec 15 14:18:40 crc kubenswrapper[4794]: I1215 14:18:40.746883 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2738b9-93b6-4db4-8fb3-47edbf0dd324" path="/var/lib/kubelet/pods/af2738b9-93b6-4db4-8fb3-47edbf0dd324/volumes"
Dec 15 14:18:41 crc kubenswrapper[4794]: I1215 14:18:41.589424 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerStarted","Data":"8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c"}
Dec 15 14:18:41 crc kubenswrapper[4794]: I1215 14:18:41.589688 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerStarted","Data":"2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c"}
Dec 15 14:18:42 crc kubenswrapper[4794]: I1215 14:18:42.602453 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerStarted","Data":"3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85"}
Dec 15 14:18:45 crc kubenswrapper[4794]: I1215 14:18:45.638768 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerStarted","Data":"dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970"}
Dec 15 14:18:45 crc kubenswrapper[4794]: I1215 14:18:45.639722 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0"
Dec 15 14:18:45 crc kubenswrapper[4794]: I1215 14:18:45.674011 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=3.007629651 podStartE2EDuration="7.673982286s" podCreationTimestamp="2025-12-15 14:18:38 +0000 UTC" firstStartedPulling="2025-12-15 14:18:39.721208061 +0000 UTC m=+1481.573230519" lastFinishedPulling="2025-12-15 14:18:44.387560716 +0000 UTC m=+1486.239583154" observedRunningTime="2025-12-15 14:18:45.661242326 +0000 UTC m=+1487.513264794" watchObservedRunningTime="2025-12-15 14:18:45.673982286 +0000 UTC m=+1487.526004754"
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.265471 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-cad7-account-create-rcmsr"]
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.266534 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.269161 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret"
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.290942 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-cad7-account-create-rcmsr"]
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.421262 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5kx8\" (UniqueName: \"kubernetes.io/projected/985fe1f8-1958-4639-8518-80b5d9b321db-kube-api-access-s5kx8\") pod \"watcher-cad7-account-create-rcmsr\" (UID: \"985fe1f8-1958-4639-8518-80b5d9b321db\") " pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.522841 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5kx8\" (UniqueName: \"kubernetes.io/projected/985fe1f8-1958-4639-8518-80b5d9b321db-kube-api-access-s5kx8\") pod \"watcher-cad7-account-create-rcmsr\" (UID: \"985fe1f8-1958-4639-8518-80b5d9b321db\") " pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.546216 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5kx8\" (UniqueName: \"kubernetes.io/projected/985fe1f8-1958-4639-8518-80b5d9b321db-kube-api-access-s5kx8\") pod \"watcher-cad7-account-create-rcmsr\" (UID: \"985fe1f8-1958-4639-8518-80b5d9b321db\") " pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:46 crc kubenswrapper[4794]: I1215 14:18:46.588338 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:47 crc kubenswrapper[4794]: I1215 14:18:47.115001 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-cad7-account-create-rcmsr"]
Dec 15 14:18:47 crc kubenswrapper[4794]: W1215 14:18:47.118100 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod985fe1f8_1958_4639_8518_80b5d9b321db.slice/crio-962b3a0d3e6b48dd334e53f6055460db0333bb8c229fc6b94abe9f54a7b7eb30 WatchSource:0}: Error finding container 962b3a0d3e6b48dd334e53f6055460db0333bb8c229fc6b94abe9f54a7b7eb30: Status 404 returned error can't find the container with id 962b3a0d3e6b48dd334e53f6055460db0333bb8c229fc6b94abe9f54a7b7eb30
Dec 15 14:18:47 crc kubenswrapper[4794]: I1215 14:18:47.673280 4794 generic.go:334] "Generic (PLEG): container finished" podID="985fe1f8-1958-4639-8518-80b5d9b321db" containerID="74627950df29d946c3be1a0887e6d8c4747f339f8a5f5c1a0947c460fc983af9" exitCode=0
Dec 15 14:18:47 crc kubenswrapper[4794]: I1215 14:18:47.673328 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr" event={"ID":"985fe1f8-1958-4639-8518-80b5d9b321db","Type":"ContainerDied","Data":"74627950df29d946c3be1a0887e6d8c4747f339f8a5f5c1a0947c460fc983af9"}
Dec 15 14:18:47 crc kubenswrapper[4794]: I1215 14:18:47.673356 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr" event={"ID":"985fe1f8-1958-4639-8518-80b5d9b321db","Type":"ContainerStarted","Data":"962b3a0d3e6b48dd334e53f6055460db0333bb8c229fc6b94abe9f54a7b7eb30"}
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.138458 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.197268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5kx8\" (UniqueName: \"kubernetes.io/projected/985fe1f8-1958-4639-8518-80b5d9b321db-kube-api-access-s5kx8\") pod \"985fe1f8-1958-4639-8518-80b5d9b321db\" (UID: \"985fe1f8-1958-4639-8518-80b5d9b321db\") "
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.220613 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985fe1f8-1958-4639-8518-80b5d9b321db-kube-api-access-s5kx8" (OuterVolumeSpecName: "kube-api-access-s5kx8") pod "985fe1f8-1958-4639-8518-80b5d9b321db" (UID: "985fe1f8-1958-4639-8518-80b5d9b321db"). InnerVolumeSpecName "kube-api-access-s5kx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.299434 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5kx8\" (UniqueName: \"kubernetes.io/projected/985fe1f8-1958-4639-8518-80b5d9b321db-kube-api-access-s5kx8\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.698355 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr" event={"ID":"985fe1f8-1958-4639-8518-80b5d9b321db","Type":"ContainerDied","Data":"962b3a0d3e6b48dd334e53f6055460db0333bb8c229fc6b94abe9f54a7b7eb30"}
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.698648 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962b3a0d3e6b48dd334e53f6055460db0333bb8c229fc6b94abe9f54a7b7eb30"
Dec 15 14:18:49 crc kubenswrapper[4794]: I1215 14:18:49.698450 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-cad7-account-create-rcmsr"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.591831 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"]
Dec 15 14:18:51 crc kubenswrapper[4794]: E1215 14:18:51.592286 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985fe1f8-1958-4639-8518-80b5d9b321db" containerName="mariadb-account-create"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.592307 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="985fe1f8-1958-4639-8518-80b5d9b321db" containerName="mariadb-account-create"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.592620 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="985fe1f8-1958-4639-8518-80b5d9b321db" containerName="mariadb-account-create"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.593439 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.598020 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9pwzp"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.598488 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"]
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.599520 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.747496 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.747898 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.747933 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-config-data\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.747962 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jjm\" (UniqueName: \"kubernetes.io/projected/608e51f6-df42-4462-b311-16f2c33e218f-kube-api-access-x6jjm\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.850157 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.850324 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.850413 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-config-data\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.850464 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6jjm\" (UniqueName: \"kubernetes.io/projected/608e51f6-df42-4462-b311-16f2c33e218f-kube-api-access-x6jjm\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.858364 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-config-data\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.860379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.872070 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.892423 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jjm\" (UniqueName: \"kubernetes.io/projected/608e51f6-df42-4462-b311-16f2c33e218f-kube-api-access-x6jjm\") pod \"watcher-kuttl-db-sync-lp9wm\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:51 crc kubenswrapper[4794]: I1215 14:18:51.920329 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:52 crc kubenswrapper[4794]: I1215 14:18:52.473025 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"]
Dec 15 14:18:52 crc kubenswrapper[4794]: W1215 14:18:52.478319 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod608e51f6_df42_4462_b311_16f2c33e218f.slice/crio-bf56e6574bb4d03e21182e0df534dfe82d9bb7a32e545ff52d05747596f72397 WatchSource:0}: Error finding container bf56e6574bb4d03e21182e0df534dfe82d9bb7a32e545ff52d05747596f72397: Status 404 returned error can't find the container with id bf56e6574bb4d03e21182e0df534dfe82d9bb7a32e545ff52d05747596f72397
Dec 15 14:18:52 crc kubenswrapper[4794]: I1215 14:18:52.720689 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm" event={"ID":"608e51f6-df42-4462-b311-16f2c33e218f","Type":"ContainerStarted","Data":"2bf9b6dd86b8f0e7a17bbe02883181ece0e03d94815275a0f416efeb963f6f8f"}
Dec 15 14:18:52 crc kubenswrapper[4794]: I1215 14:18:52.721014 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm" event={"ID":"608e51f6-df42-4462-b311-16f2c33e218f","Type":"ContainerStarted","Data":"bf56e6574bb4d03e21182e0df534dfe82d9bb7a32e545ff52d05747596f72397"}
Dec 15 14:18:52 crc kubenswrapper[4794]: I1215 14:18:52.734817 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm" podStartSLOduration=1.7347961459999999 podStartE2EDuration="1.734796146s" podCreationTimestamp="2025-12-15 14:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:52.733378966 +0000 UTC m=+1494.585401424" watchObservedRunningTime="2025-12-15 14:18:52.734796146 +0000 UTC m=+1494.586818594"
Dec 15 14:18:54 crc kubenswrapper[4794]: I1215 14:18:54.534470 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 15 14:18:54 crc kubenswrapper[4794]: I1215 14:18:54.534852 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 15 14:18:55 crc kubenswrapper[4794]: I1215 14:18:55.751268 4794 generic.go:334] "Generic (PLEG): container finished" podID="608e51f6-df42-4462-b311-16f2c33e218f" containerID="2bf9b6dd86b8f0e7a17bbe02883181ece0e03d94815275a0f416efeb963f6f8f" exitCode=0
Dec 15 14:18:55 crc kubenswrapper[4794]: I1215 14:18:55.751744 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm" event={"ID":"608e51f6-df42-4462-b311-16f2c33e218f","Type":"ContainerDied","Data":"2bf9b6dd86b8f0e7a17bbe02883181ece0e03d94815275a0f416efeb963f6f8f"}
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.142103 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.336838 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6jjm\" (UniqueName: \"kubernetes.io/projected/608e51f6-df42-4462-b311-16f2c33e218f-kube-api-access-x6jjm\") pod \"608e51f6-df42-4462-b311-16f2c33e218f\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") "
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.336933 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-config-data\") pod \"608e51f6-df42-4462-b311-16f2c33e218f\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") "
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.337052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-combined-ca-bundle\") pod \"608e51f6-df42-4462-b311-16f2c33e218f\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") "
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.337158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-db-sync-config-data\") pod \"608e51f6-df42-4462-b311-16f2c33e218f\" (UID: \"608e51f6-df42-4462-b311-16f2c33e218f\") "
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.343825 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/608e51f6-df42-4462-b311-16f2c33e218f-kube-api-access-x6jjm" (OuterVolumeSpecName: "kube-api-access-x6jjm") pod "608e51f6-df42-4462-b311-16f2c33e218f" (UID: "608e51f6-df42-4462-b311-16f2c33e218f"). InnerVolumeSpecName "kube-api-access-x6jjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.351712 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "608e51f6-df42-4462-b311-16f2c33e218f" (UID: "608e51f6-df42-4462-b311-16f2c33e218f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.360291 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "608e51f6-df42-4462-b311-16f2c33e218f" (UID: "608e51f6-df42-4462-b311-16f2c33e218f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.387336 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-config-data" (OuterVolumeSpecName: "config-data") pod "608e51f6-df42-4462-b311-16f2c33e218f" (UID: "608e51f6-df42-4462-b311-16f2c33e218f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.438414 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.438446 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6jjm\" (UniqueName: \"kubernetes.io/projected/608e51f6-df42-4462-b311-16f2c33e218f-kube-api-access-x6jjm\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.438461 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-config-data\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.438471 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608e51f6-df42-4462-b311-16f2c33e218f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.772657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm" event={"ID":"608e51f6-df42-4462-b311-16f2c33e218f","Type":"ContainerDied","Data":"bf56e6574bb4d03e21182e0df534dfe82d9bb7a32e545ff52d05747596f72397"}
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.772929 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf56e6574bb4d03e21182e0df534dfe82d9bb7a32e545ff52d05747596f72397"
Dec 15 14:18:57 crc kubenswrapper[4794]: I1215 14:18:57.772693 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.104767 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 15 14:18:58 crc kubenswrapper[4794]: E1215 14:18:58.105197 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608e51f6-df42-4462-b311-16f2c33e218f" containerName="watcher-kuttl-db-sync"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.105220 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="608e51f6-df42-4462-b311-16f2c33e218f" containerName="watcher-kuttl-db-sync"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.105423 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="608e51f6-df42-4462-b311-16f2c33e218f" containerName="watcher-kuttl-db-sync"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.106200 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.108092 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.110900 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9pwzp"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.118620 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"]
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.146014 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.147732 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.152699 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.152743 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.153043 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.161831 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"]
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.210129 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.211795 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.213720 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.223379 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"]
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.251791 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.251861 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.251900 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0"
Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.251938 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.251957 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d249f723-309a-44c0-a674-cf1ae0531ba2-logs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.251977 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be758f6-763d-498b-8145-717da7e9c490-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.252003 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.252023 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.252052 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxc5r\" (UniqueName: 
\"kubernetes.io/projected/d249f723-309a-44c0-a674-cf1ae0531ba2-kube-api-access-nxc5r\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.252076 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wws6r\" (UniqueName: \"kubernetes.io/projected/0be758f6-763d-498b-8145-717da7e9c490-kube-api-access-wws6r\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.252091 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.252117 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353366 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe0138b-48f1-4462-b7bc-45ec9dad33f7-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353419 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353442 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353476 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353508 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353525 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d249f723-309a-44c0-a674-cf1ae0531ba2-logs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353539 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0be758f6-763d-498b-8145-717da7e9c490-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353567 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353623 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353643 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxc5r\" (UniqueName: \"kubernetes.io/projected/d249f723-309a-44c0-a674-cf1ae0531ba2-kube-api-access-nxc5r\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353660 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wws6r\" (UniqueName: \"kubernetes.io/projected/0be758f6-763d-498b-8145-717da7e9c490-kube-api-access-wws6r\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353675 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353692 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg6k7\" (UniqueName: \"kubernetes.io/projected/afe0138b-48f1-4462-b7bc-45ec9dad33f7-kube-api-access-hg6k7\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353757 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.353795 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.355306 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be758f6-763d-498b-8145-717da7e9c490-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.355668 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d249f723-309a-44c0-a674-cf1ae0531ba2-logs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.359401 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.359421 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.359985 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.360557 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.360808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.363168 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.364047 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.366447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.374762 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxc5r\" (UniqueName: \"kubernetes.io/projected/d249f723-309a-44c0-a674-cf1ae0531ba2-kube-api-access-nxc5r\") pod 
\"watcher-kuttl-api-0\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.376984 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wws6r\" (UniqueName: \"kubernetes.io/projected/0be758f6-763d-498b-8145-717da7e9c490-kube-api-access-wws6r\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.425099 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.455639 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.455738 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe0138b-48f1-4462-b7bc-45ec9dad33f7-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.455771 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.455858 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hg6k7\" (UniqueName: \"kubernetes.io/projected/afe0138b-48f1-4462-b7bc-45ec9dad33f7-kube-api-access-hg6k7\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.456375 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe0138b-48f1-4462-b7bc-45ec9dad33f7-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.461574 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.462286 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.470603 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.472986 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6k7\" (UniqueName: \"kubernetes.io/projected/afe0138b-48f1-4462-b7bc-45ec9dad33f7-kube-api-access-hg6k7\") pod \"watcher-kuttl-applier-0\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.539287 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:18:58 crc kubenswrapper[4794]: I1215 14:18:58.929481 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:18:58 crc kubenswrapper[4794]: W1215 14:18:58.936133 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be758f6_763d_498b_8145_717da7e9c490.slice/crio-33238bbb0d5886c73ebdfef4e6372055a7a75d1753bcc4c815f9fbefe05190e2 WatchSource:0}: Error finding container 33238bbb0d5886c73ebdfef4e6372055a7a75d1753bcc4c815f9fbefe05190e2: Status 404 returned error can't find the container with id 33238bbb0d5886c73ebdfef4e6372055a7a75d1753bcc4c815f9fbefe05190e2 Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.022036 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.027488 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:18:59 crc kubenswrapper[4794]: W1215 14:18:59.028330 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd249f723_309a_44c0_a674_cf1ae0531ba2.slice/crio-73d05de95e39512c7bbfead63b11cb484531b51d02c0c6d8c2b5ec3e6a123425 WatchSource:0}: Error finding container 73d05de95e39512c7bbfead63b11cb484531b51d02c0c6d8c2b5ec3e6a123425: Status 404 returned error can't find the container with id 73d05de95e39512c7bbfead63b11cb484531b51d02c0c6d8c2b5ec3e6a123425 Dec 15 14:18:59 crc kubenswrapper[4794]: W1215 14:18:59.030765 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe0138b_48f1_4462_b7bc_45ec9dad33f7.slice/crio-e4b30d4d9e2d7cc12d63bf2e3e51ab5c50b92fb98ad4a6ba1d05eaf21bbfd78e WatchSource:0}: Error finding container e4b30d4d9e2d7cc12d63bf2e3e51ab5c50b92fb98ad4a6ba1d05eaf21bbfd78e: Status 404 returned error can't find the container with id e4b30d4d9e2d7cc12d63bf2e3e51ab5c50b92fb98ad4a6ba1d05eaf21bbfd78e Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.838218 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"afe0138b-48f1-4462-b7bc-45ec9dad33f7","Type":"ContainerStarted","Data":"622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.838273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"afe0138b-48f1-4462-b7bc-45ec9dad33f7","Type":"ContainerStarted","Data":"e4b30d4d9e2d7cc12d63bf2e3e51ab5c50b92fb98ad4a6ba1d05eaf21bbfd78e"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.843923 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0be758f6-763d-498b-8145-717da7e9c490","Type":"ContainerStarted","Data":"d8923a289af47c85af2e9969c8221202acd0d70e06addbb4105c03ffa88206e0"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.843976 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0be758f6-763d-498b-8145-717da7e9c490","Type":"ContainerStarted","Data":"33238bbb0d5886c73ebdfef4e6372055a7a75d1753bcc4c815f9fbefe05190e2"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.845643 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d249f723-309a-44c0-a674-cf1ae0531ba2","Type":"ContainerStarted","Data":"bbd4bc77466da6b99ef8e73dd28c5fba5aac5df9cd67dcb55d29fb13d2b00539"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.845677 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d249f723-309a-44c0-a674-cf1ae0531ba2","Type":"ContainerStarted","Data":"1a2d4061e9dcd01f5f5f94cd9d76a2a3d5bfb81508d733781873b817fbfe61dc"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.845687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d249f723-309a-44c0-a674-cf1ae0531ba2","Type":"ContainerStarted","Data":"73d05de95e39512c7bbfead63b11cb484531b51d02c0c6d8c2b5ec3e6a123425"} Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.846279 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.865842 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.865817901 podStartE2EDuration="1.865817901s" podCreationTimestamp="2025-12-15 14:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:59.861906091 +0000 UTC m=+1501.713928529" watchObservedRunningTime="2025-12-15 14:18:59.865817901 +0000 UTC m=+1501.717840349" Dec 15 14:18:59 crc 
kubenswrapper[4794]: I1215 14:18:59.904201 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.9041830659999999 podStartE2EDuration="1.904183066s" podCreationTimestamp="2025-12-15 14:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:59.88312264 +0000 UTC m=+1501.735145088" watchObservedRunningTime="2025-12-15 14:18:59.904183066 +0000 UTC m=+1501.756205504" Dec 15 14:18:59 crc kubenswrapper[4794]: I1215 14:18:59.928489 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.928465182 podStartE2EDuration="1.928465182s" podCreationTimestamp="2025-12-15 14:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:18:59.922292188 +0000 UTC m=+1501.774314626" watchObservedRunningTime="2025-12-15 14:18:59.928465182 +0000 UTC m=+1501.780487620" Dec 15 14:19:01 crc kubenswrapper[4794]: I1215 14:19:01.861411 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 14:19:02 crc kubenswrapper[4794]: I1215 14:19:02.024600 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:03 crc kubenswrapper[4794]: I1215 14:19:03.471098 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:03 crc kubenswrapper[4794]: I1215 14:19:03.541352 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.426293 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.451124 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.471341 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.480680 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.540325 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.570689 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.925899 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.944791 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.964279 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:08 crc kubenswrapper[4794]: I1215 14:19:08.979736 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:09 crc kubenswrapper[4794]: I1215 14:19:09.231484 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 
14:19:11.086700 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.087219 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-central-agent" containerID="cri-o://2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c" gracePeriod=30 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.087314 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="sg-core" containerID="cri-o://3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85" gracePeriod=30 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.087330 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="proxy-httpd" containerID="cri-o://dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970" gracePeriod=30 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.087334 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-notification-agent" containerID="cri-o://8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c" gracePeriod=30 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.995672 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerID="dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970" exitCode=0 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.995959 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1adb70b-8c07-4780-aab2-e793f3f85f63" 
containerID="3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85" exitCode=2 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.995968 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerID="2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c" exitCode=0 Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.995992 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerDied","Data":"dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970"} Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.996020 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerDied","Data":"3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85"} Dec 15 14:19:11 crc kubenswrapper[4794]: I1215 14:19:11.996031 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerDied","Data":"2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c"} Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.737306 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848354 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-run-httpd\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848413 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-log-httpd\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848501 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-sg-core-conf-yaml\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848544 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-ceilometer-tls-certs\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848621 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-scripts\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848894 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.848761 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.849353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzdjz\" (UniqueName: \"kubernetes.io/projected/d1adb70b-8c07-4780-aab2-e793f3f85f63-kube-api-access-wzdjz\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.849440 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-combined-ca-bundle\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.849478 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-config-data\") pod \"d1adb70b-8c07-4780-aab2-e793f3f85f63\" (UID: \"d1adb70b-8c07-4780-aab2-e793f3f85f63\") " Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.849972 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.849996 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1adb70b-8c07-4780-aab2-e793f3f85f63-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.858784 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-scripts" (OuterVolumeSpecName: "scripts") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.858815 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1adb70b-8c07-4780-aab2-e793f3f85f63-kube-api-access-wzdjz" (OuterVolumeSpecName: "kube-api-access-wzdjz") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "kube-api-access-wzdjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.873969 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.892779 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.930607 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-config-data" (OuterVolumeSpecName: "config-data") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.932434 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1adb70b-8c07-4780-aab2-e793f3f85f63" (UID: "d1adb70b-8c07-4780-aab2-e793f3f85f63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.951086 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.951133 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.951146 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.951157 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzdjz\" (UniqueName: \"kubernetes.io/projected/d1adb70b-8c07-4780-aab2-e793f3f85f63-kube-api-access-wzdjz\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.951171 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:12 crc kubenswrapper[4794]: I1215 14:19:12.951183 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1adb70b-8c07-4780-aab2-e793f3f85f63-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.005681 4794 generic.go:334] "Generic (PLEG): container finished" podID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerID="8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c" exitCode=0 Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.005719 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerDied","Data":"8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c"} Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.005743 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d1adb70b-8c07-4780-aab2-e793f3f85f63","Type":"ContainerDied","Data":"52581c3642cbc3f3c52904462191a10d4887efb1ee2ad86c6b08982b07fe77f3"} Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.005791 4794 scope.go:117] "RemoveContainer" containerID="dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.005927 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.044363 4794 scope.go:117] "RemoveContainer" containerID="3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.049403 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.058514 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.070433 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 14:19:13.070912 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-central-agent" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.070934 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-central-agent" Dec 15 14:19:13 
crc kubenswrapper[4794]: E1215 14:19:13.070953 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-notification-agent" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.070961 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-notification-agent" Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 14:19:13.070973 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="proxy-httpd" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.070981 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="proxy-httpd" Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 14:19:13.070994 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="sg-core" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.071001 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="sg-core" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.071175 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="proxy-httpd" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.071195 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="sg-core" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.071215 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" containerName="ceilometer-notification-agent" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.071232 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" 
containerName="ceilometer-central-agent" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.073004 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.077182 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.077335 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.077616 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.081880 4794 scope.go:117] "RemoveContainer" containerID="8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.086612 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.114891 4794 scope.go:117] "RemoveContainer" containerID="2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.132461 4794 scope.go:117] "RemoveContainer" containerID="dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970" Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 14:19:13.133795 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970\": container with ID starting with dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970 not found: ID does not exist" containerID="dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.133844 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970"} err="failed to get container status \"dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970\": rpc error: code = NotFound desc = could not find container \"dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970\": container with ID starting with dafe6460af6813c7069e793b74694cb5b03b321f62e888323206b7522801f970 not found: ID does not exist" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.133870 4794 scope.go:117] "RemoveContainer" containerID="3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85" Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 14:19:13.134309 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85\": container with ID starting with 3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85 not found: ID does not exist" containerID="3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.134338 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85"} err="failed to get container status \"3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85\": rpc error: code = NotFound desc = could not find container \"3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85\": container with ID starting with 3bd152746564d9ffb081f60fc8205e61bfe4de13fd6566e395bf572b87299e85 not found: ID does not exist" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.134387 4794 scope.go:117] "RemoveContainer" containerID="8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c" Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 
14:19:13.134768 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c\": container with ID starting with 8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c not found: ID does not exist" containerID="8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.134801 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c"} err="failed to get container status \"8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c\": rpc error: code = NotFound desc = could not find container \"8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c\": container with ID starting with 8af75c692fc4dff7cc79ea2a00db96946a06db649a8973f8879a61daed82c15c not found: ID does not exist" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.134827 4794 scope.go:117] "RemoveContainer" containerID="2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c" Dec 15 14:19:13 crc kubenswrapper[4794]: E1215 14:19:13.135121 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c\": container with ID starting with 2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c not found: ID does not exist" containerID="2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.135146 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c"} err="failed to get container status \"2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c\": rpc 
error: code = NotFound desc = could not find container \"2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c\": container with ID starting with 2437c80cd5cd72a7e818744304faaa3de71901488e1910ff7c0b22da3f73f35c not found: ID does not exist" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.255878 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knggn\" (UniqueName: \"kubernetes.io/projected/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-kube-api-access-knggn\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.255984 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-log-httpd\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.256058 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-scripts\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.256103 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.256187 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.256341 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.256436 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-run-httpd\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.256513 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-config-data\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357118 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-run-httpd\") pod \"ceilometer-0\" (UID: 
\"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357192 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-config-data\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357229 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knggn\" (UniqueName: \"kubernetes.io/projected/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-kube-api-access-knggn\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357266 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-log-httpd\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357300 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-scripts\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357323 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357362 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357696 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-run-httpd\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.357772 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-log-httpd\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.361421 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-config-data\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.363121 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.363712 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.364824 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-scripts\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.365002 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.373258 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knggn\" (UniqueName: \"kubernetes.io/projected/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-kube-api-access-knggn\") pod \"ceilometer-0\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.401972 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:13 crc kubenswrapper[4794]: I1215 14:19:13.911816 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:19:14 crc kubenswrapper[4794]: I1215 14:19:14.014812 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerStarted","Data":"388b771aa4726d0c4082501d32b669418f7da05f77069b5e485eeff76e8f253e"} Dec 15 14:19:14 crc kubenswrapper[4794]: I1215 14:19:14.747393 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1adb70b-8c07-4780-aab2-e793f3f85f63" path="/var/lib/kubelet/pods/d1adb70b-8c07-4780-aab2-e793f3f85f63/volumes" Dec 15 14:19:15 crc kubenswrapper[4794]: I1215 14:19:15.022998 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerStarted","Data":"7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc"} Dec 15 14:19:16 crc kubenswrapper[4794]: I1215 14:19:16.031477 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerStarted","Data":"aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af"} Dec 15 14:19:17 crc kubenswrapper[4794]: I1215 14:19:17.044153 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerStarted","Data":"cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5"} Dec 15 14:19:18 crc kubenswrapper[4794]: I1215 14:19:18.054738 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerStarted","Data":"ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da"} Dec 15 14:19:18 crc kubenswrapper[4794]: I1215 14:19:18.056333 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:18 crc kubenswrapper[4794]: I1215 14:19:18.077261 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.5434890719999999 podStartE2EDuration="5.077243467s" podCreationTimestamp="2025-12-15 14:19:13 +0000 UTC" firstStartedPulling="2025-12-15 14:19:13.934609188 +0000 UTC m=+1515.786631626" lastFinishedPulling="2025-12-15 14:19:17.468363533 +0000 UTC m=+1519.320386021" observedRunningTime="2025-12-15 14:19:18.074508 +0000 UTC m=+1519.926530468" watchObservedRunningTime="2025-12-15 14:19:18.077243467 +0000 UTC m=+1519.929265925" Dec 15 14:19:22 crc kubenswrapper[4794]: I1215 14:19:22.905765 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:19:22 crc kubenswrapper[4794]: I1215 14:19:22.906528 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/memcached-0" podUID="6abf8491-0715-4c16-9653-fecbcfd68ed0" containerName="memcached" containerID="cri-o://720d2f6082eec541d657b9f1354711b973744db66bbff859554cfe83c3a2ba52" gracePeriod=30 Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.000206 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.000439 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-kuttl-api-log" containerID="cri-o://1a2d4061e9dcd01f5f5f94cd9d76a2a3d5bfb81508d733781873b817fbfe61dc" 
gracePeriod=30 Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.000617 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-api" containerID="cri-o://bbd4bc77466da6b99ef8e73dd28c5fba5aac5df9cd67dcb55d29fb13d2b00539" gracePeriod=30 Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.053893 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.054119 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="0be758f6-763d-498b-8145-717da7e9c490" containerName="watcher-decision-engine" containerID="cri-o://d8923a289af47c85af2e9969c8221202acd0d70e06addbb4105c03ffa88206e0" gracePeriod=30 Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.060773 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.060995 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" containerName="watcher-applier" containerID="cri-o://622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77" gracePeriod=30 Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.104771 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-cnrfp"] Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.125025 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-cnrfp"] Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.196778 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-hmftj"] Dec 
15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.197835 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.202670 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-mtls" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.216685 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-hmftj"] Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.330422 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-scripts\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.330503 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-config-data\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.330611 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-fernet-keys\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.330632 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-cert-memcached-mtls\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.331185 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-combined-ca-bundle\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.331244 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-credential-keys\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.331306 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmghn\" (UniqueName: \"kubernetes.io/projected/b758c36f-801e-4cb6-a477-d6a81d9d04cd-kube-api-access-pmghn\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.432901 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-config-data\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.432998 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-cert-memcached-mtls\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.433020 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-fernet-keys\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.433069 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-combined-ca-bundle\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.433094 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-credential-keys\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.433128 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmghn\" (UniqueName: \"kubernetes.io/projected/b758c36f-801e-4cb6-a477-d6a81d9d04cd-kube-api-access-pmghn\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.433166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-scripts\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.439250 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-credential-keys\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.439415 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-config-data\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.439710 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-fernet-keys\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.443954 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-scripts\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.448792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-cert-memcached-mtls\") pod 
\"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.455598 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-combined-ca-bundle\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.463700 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmghn\" (UniqueName: \"kubernetes.io/projected/b758c36f-801e-4cb6-a477-d6a81d9d04cd-kube-api-access-pmghn\") pod \"keystone-bootstrap-hmftj\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.518725 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:23 crc kubenswrapper[4794]: E1215 14:19:23.550940 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:19:23 crc kubenswrapper[4794]: E1215 14:19:23.552530 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:19:23 crc kubenswrapper[4794]: E1215 14:19:23.553624 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:19:23 crc kubenswrapper[4794]: E1215 14:19:23.553647 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" containerName="watcher-applier" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.863235 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.158:9322/\": read tcp 
10.217.0.2:34624->10.217.0.158:9322: read: connection reset by peer" Dec 15 14:19:23 crc kubenswrapper[4794]: I1215 14:19:23.863848 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9322/\": read tcp 10.217.0.2:34638->10.217.0.158:9322: read: connection reset by peer" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.029871 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-hmftj"] Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.146648 4794 generic.go:334] "Generic (PLEG): container finished" podID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerID="bbd4bc77466da6b99ef8e73dd28c5fba5aac5df9cd67dcb55d29fb13d2b00539" exitCode=0 Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.146690 4794 generic.go:334] "Generic (PLEG): container finished" podID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerID="1a2d4061e9dcd01f5f5f94cd9d76a2a3d5bfb81508d733781873b817fbfe61dc" exitCode=143 Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.147033 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d249f723-309a-44c0-a674-cf1ae0531ba2","Type":"ContainerDied","Data":"bbd4bc77466da6b99ef8e73dd28c5fba5aac5df9cd67dcb55d29fb13d2b00539"} Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.147075 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d249f723-309a-44c0-a674-cf1ae0531ba2","Type":"ContainerDied","Data":"1a2d4061e9dcd01f5f5f94cd9d76a2a3d5bfb81508d733781873b817fbfe61dc"} Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.156692 4794 generic.go:334] "Generic (PLEG): container finished" podID="6abf8491-0715-4c16-9653-fecbcfd68ed0" 
containerID="720d2f6082eec541d657b9f1354711b973744db66bbff859554cfe83c3a2ba52" exitCode=0 Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.156941 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"6abf8491-0715-4c16-9653-fecbcfd68ed0","Type":"ContainerDied","Data":"720d2f6082eec541d657b9f1354711b973744db66bbff859554cfe83c3a2ba52"} Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.159933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" event={"ID":"b758c36f-801e-4cb6-a477-d6a81d9d04cd","Type":"ContainerStarted","Data":"67d6f18738c387ebcc6f59d3b5fa1283ed229a1140ce984bf2a0c63a0900c5e1"} Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.287297 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.456766 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-custom-prometheus-ca\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.456896 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-config-data\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.456936 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d249f723-309a-44c0-a674-cf1ae0531ba2-logs\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: 
I1215 14:19:24.456986 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-internal-tls-certs\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.457015 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxc5r\" (UniqueName: \"kubernetes.io/projected/d249f723-309a-44c0-a674-cf1ae0531ba2-kube-api-access-nxc5r\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.457031 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-combined-ca-bundle\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.457048 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-public-tls-certs\") pod \"d249f723-309a-44c0-a674-cf1ae0531ba2\" (UID: \"d249f723-309a-44c0-a674-cf1ae0531ba2\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.458739 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d249f723-309a-44c0-a674-cf1ae0531ba2-logs" (OuterVolumeSpecName: "logs") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.471496 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d249f723-309a-44c0-a674-cf1ae0531ba2-kube-api-access-nxc5r" (OuterVolumeSpecName: "kube-api-access-nxc5r") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "kube-api-access-nxc5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.486926 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.492419 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.514105 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.525598 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-config-data" (OuterVolumeSpecName: "config-data") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.528746 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d249f723-309a-44c0-a674-cf1ae0531ba2" (UID: "d249f723-309a-44c0-a674-cf1ae0531ba2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.534200 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.534259 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.558982 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.559025 4794 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d249f723-309a-44c0-a674-cf1ae0531ba2-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.559056 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.559072 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.559084 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxc5r\" (UniqueName: \"kubernetes.io/projected/d249f723-309a-44c0-a674-cf1ae0531ba2-kube-api-access-nxc5r\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.559095 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.559105 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d249f723-309a-44c0-a674-cf1ae0531ba2-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.693922 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.751927 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df16e0e7-1842-4fef-9827-42ddbe237256" path="/var/lib/kubelet/pods/df16e0e7-1842-4fef-9827-42ddbe237256/volumes" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.862996 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-combined-ca-bundle\") pod \"6abf8491-0715-4c16-9653-fecbcfd68ed0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.863051 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-memcached-tls-certs\") pod \"6abf8491-0715-4c16-9653-fecbcfd68ed0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.863094 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-config-data\") pod \"6abf8491-0715-4c16-9653-fecbcfd68ed0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.863154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-kolla-config\") pod \"6abf8491-0715-4c16-9653-fecbcfd68ed0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.863184 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tp9r\" (UniqueName: 
\"kubernetes.io/projected/6abf8491-0715-4c16-9653-fecbcfd68ed0-kube-api-access-5tp9r\") pod \"6abf8491-0715-4c16-9653-fecbcfd68ed0\" (UID: \"6abf8491-0715-4c16-9653-fecbcfd68ed0\") " Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.871531 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-config-data" (OuterVolumeSpecName: "config-data") pod "6abf8491-0715-4c16-9653-fecbcfd68ed0" (UID: "6abf8491-0715-4c16-9653-fecbcfd68ed0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.871740 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6abf8491-0715-4c16-9653-fecbcfd68ed0" (UID: "6abf8491-0715-4c16-9653-fecbcfd68ed0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.877226 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abf8491-0715-4c16-9653-fecbcfd68ed0-kube-api-access-5tp9r" (OuterVolumeSpecName: "kube-api-access-5tp9r") pod "6abf8491-0715-4c16-9653-fecbcfd68ed0" (UID: "6abf8491-0715-4c16-9653-fecbcfd68ed0"). InnerVolumeSpecName "kube-api-access-5tp9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.886014 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6abf8491-0715-4c16-9653-fecbcfd68ed0" (UID: "6abf8491-0715-4c16-9653-fecbcfd68ed0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.911406 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "6abf8491-0715-4c16-9653-fecbcfd68ed0" (UID: "6abf8491-0715-4c16-9653-fecbcfd68ed0"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.968639 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.968678 4794 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf8491-0715-4c16-9653-fecbcfd68ed0-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.968690 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.968701 4794 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf8491-0715-4c16-9653-fecbcfd68ed0-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:24 crc kubenswrapper[4794]: I1215 14:19:24.968713 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tp9r\" (UniqueName: \"kubernetes.io/projected/6abf8491-0715-4c16-9653-fecbcfd68ed0-kube-api-access-5tp9r\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.170166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/keystone-bootstrap-hmftj" event={"ID":"b758c36f-801e-4cb6-a477-d6a81d9d04cd","Type":"ContainerStarted","Data":"0458b18c5616718b1599c32062ac677c77b74357e975dc80212cb5a039a5fb84"} Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.173217 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.173216 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"d249f723-309a-44c0-a674-cf1ae0531ba2","Type":"ContainerDied","Data":"73d05de95e39512c7bbfead63b11cb484531b51d02c0c6d8c2b5ec3e6a123425"} Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.173404 4794 scope.go:117] "RemoveContainer" containerID="bbd4bc77466da6b99ef8e73dd28c5fba5aac5df9cd67dcb55d29fb13d2b00539" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.175066 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"6abf8491-0715-4c16-9653-fecbcfd68ed0","Type":"ContainerDied","Data":"b6bf49192031bd58afdeee7f83ea49488deb805ddc0680dee6324a71482e5813"} Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.175113 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.198935 4794 scope.go:117] "RemoveContainer" containerID="1a2d4061e9dcd01f5f5f94cd9d76a2a3d5bfb81508d733781873b817fbfe61dc" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.216793 4794 scope.go:117] "RemoveContainer" containerID="720d2f6082eec541d657b9f1354711b973744db66bbff859554cfe83c3a2ba52" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.221590 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" podStartSLOduration=2.221564449 podStartE2EDuration="2.221564449s" podCreationTimestamp="2025-12-15 14:19:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:25.217011971 +0000 UTC m=+1527.069034409" watchObservedRunningTime="2025-12-15 14:19:25.221564449 +0000 UTC m=+1527.073586887" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.247913 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.256523 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.284061 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.294712 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.306763 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: E1215 14:19:25.307149 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abf8491-0715-4c16-9653-fecbcfd68ed0" 
containerName="memcached" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.307168 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abf8491-0715-4c16-9653-fecbcfd68ed0" containerName="memcached" Dec 15 14:19:25 crc kubenswrapper[4794]: E1215 14:19:25.307185 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-api" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.307193 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-api" Dec 15 14:19:25 crc kubenswrapper[4794]: E1215 14:19:25.307214 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-kuttl-api-log" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.307224 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-kuttl-api-log" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.307419 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-api" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.307450 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" containerName="watcher-kuttl-api-log" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.307474 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6abf8491-0715-4c16-9653-fecbcfd68ed0" containerName="memcached" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.308495 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.310829 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.310988 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.311105 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.314206 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.315449 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.318572 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-z2hg5" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.318875 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.319032 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.331336 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.356778 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475641 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m76p6\" (UniqueName: \"kubernetes.io/projected/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-kube-api-access-m76p6\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475698 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475724 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475753 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475782 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475805 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-kolla-config\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475879 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bec18956-c880-4070-bf23-4456f4a38042-logs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475904 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-config-data\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475926 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7ngh\" 
(UniqueName: \"kubernetes.io/projected/bec18956-c880-4070-bf23-4456f4a38042-kube-api-access-t7ngh\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.475968 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.476001 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.577852 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bec18956-c880-4070-bf23-4456f4a38042-logs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.577912 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-config-data\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.577944 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.577969 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7ngh\" (UniqueName: \"kubernetes.io/projected/bec18956-c880-4070-bf23-4456f4a38042-kube-api-access-t7ngh\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.577996 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578032 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578105 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m76p6\" (UniqueName: \"kubernetes.io/projected/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-kube-api-access-m76p6\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578134 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578216 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578264 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578290 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578321 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578362 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-kolla-config\") pod \"memcached-0\" (UID: 
\"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578364 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bec18956-c880-4070-bf23-4456f4a38042-logs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.578967 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-config-data\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.580408 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-kolla-config\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.582796 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.582928 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.583097 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.583360 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.590155 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.595294 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.598229 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.599290 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.601758 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m76p6\" (UniqueName: \"kubernetes.io/projected/52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4-kube-api-access-m76p6\") pod \"memcached-0\" (UID: \"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4\") " pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.609523 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7ngh\" (UniqueName: \"kubernetes.io/projected/bec18956-c880-4070-bf23-4456f4a38042-kube-api-access-t7ngh\") pod \"watcher-kuttl-api-0\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.627514 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:25 crc kubenswrapper[4794]: I1215 14:19:25.645851 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.184465 4794 generic.go:334] "Generic (PLEG): container finished" podID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" containerID="622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77" exitCode=0 Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.184553 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"afe0138b-48f1-4462-b7bc-45ec9dad33f7","Type":"ContainerDied","Data":"622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77"} Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.188459 4794 generic.go:334] "Generic (PLEG): container finished" podID="0be758f6-763d-498b-8145-717da7e9c490" containerID="d8923a289af47c85af2e9969c8221202acd0d70e06addbb4105c03ffa88206e0" exitCode=0 Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.188518 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0be758f6-763d-498b-8145-717da7e9c490","Type":"ContainerDied","Data":"d8923a289af47c85af2e9969c8221202acd0d70e06addbb4105c03ffa88206e0"} Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.202113 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.295663 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Dec 15 14:19:26 crc kubenswrapper[4794]: W1215 14:19:26.299318 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52de7ae7_7c3b_4931_9313_cc3ebbe1a4f4.slice/crio-854bd0713d25a566b59369ba56d09f7811adcfa055c19c2be8f3f1a6fffd40bf WatchSource:0}: Error finding container 854bd0713d25a566b59369ba56d09f7811adcfa055c19c2be8f3f1a6fffd40bf: Status 404 returned error can't 
find the container with id 854bd0713d25a566b59369ba56d09f7811adcfa055c19c2be8f3f1a6fffd40bf Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.564904 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.569538 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.705736 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-config-data\") pod \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706054 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-combined-ca-bundle\") pod \"0be758f6-763d-498b-8145-717da7e9c490\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706077 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe0138b-48f1-4462-b7bc-45ec9dad33f7-logs\") pod \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706098 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-custom-prometheus-ca\") pod \"0be758f6-763d-498b-8145-717da7e9c490\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706134 4794 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wws6r\" (UniqueName: \"kubernetes.io/projected/0be758f6-763d-498b-8145-717da7e9c490-kube-api-access-wws6r\") pod \"0be758f6-763d-498b-8145-717da7e9c490\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706199 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-config-data\") pod \"0be758f6-763d-498b-8145-717da7e9c490\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706223 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be758f6-763d-498b-8145-717da7e9c490-logs\") pod \"0be758f6-763d-498b-8145-717da7e9c490\" (UID: \"0be758f6-763d-498b-8145-717da7e9c490\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706288 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg6k7\" (UniqueName: \"kubernetes.io/projected/afe0138b-48f1-4462-b7bc-45ec9dad33f7-kube-api-access-hg6k7\") pod \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706344 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-combined-ca-bundle\") pod \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\" (UID: \"afe0138b-48f1-4462-b7bc-45ec9dad33f7\") " Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706403 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe0138b-48f1-4462-b7bc-45ec9dad33f7-logs" (OuterVolumeSpecName: "logs") pod "afe0138b-48f1-4462-b7bc-45ec9dad33f7" (UID: 
"afe0138b-48f1-4462-b7bc-45ec9dad33f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706650 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be758f6-763d-498b-8145-717da7e9c490-logs" (OuterVolumeSpecName: "logs") pod "0be758f6-763d-498b-8145-717da7e9c490" (UID: "0be758f6-763d-498b-8145-717da7e9c490"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.706661 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe0138b-48f1-4462-b7bc-45ec9dad33f7-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.709423 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be758f6-763d-498b-8145-717da7e9c490-kube-api-access-wws6r" (OuterVolumeSpecName: "kube-api-access-wws6r") pod "0be758f6-763d-498b-8145-717da7e9c490" (UID: "0be758f6-763d-498b-8145-717da7e9c490"). InnerVolumeSpecName "kube-api-access-wws6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.709834 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe0138b-48f1-4462-b7bc-45ec9dad33f7-kube-api-access-hg6k7" (OuterVolumeSpecName: "kube-api-access-hg6k7") pod "afe0138b-48f1-4462-b7bc-45ec9dad33f7" (UID: "afe0138b-48f1-4462-b7bc-45ec9dad33f7"). InnerVolumeSpecName "kube-api-access-hg6k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.733993 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0be758f6-763d-498b-8145-717da7e9c490" (UID: "0be758f6-763d-498b-8145-717da7e9c490"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.736384 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0be758f6-763d-498b-8145-717da7e9c490" (UID: "0be758f6-763d-498b-8145-717da7e9c490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.739810 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afe0138b-48f1-4462-b7bc-45ec9dad33f7" (UID: "afe0138b-48f1-4462-b7bc-45ec9dad33f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.748351 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abf8491-0715-4c16-9653-fecbcfd68ed0" path="/var/lib/kubelet/pods/6abf8491-0715-4c16-9653-fecbcfd68ed0/volumes" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.748926 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d249f723-309a-44c0-a674-cf1ae0531ba2" path="/var/lib/kubelet/pods/d249f723-309a-44c0-a674-cf1ae0531ba2/volumes" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.753823 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-config-data" (OuterVolumeSpecName: "config-data") pod "0be758f6-763d-498b-8145-717da7e9c490" (UID: "0be758f6-763d-498b-8145-717da7e9c490"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.761800 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-config-data" (OuterVolumeSpecName: "config-data") pod "afe0138b-48f1-4462-b7bc-45ec9dad33f7" (UID: "afe0138b-48f1-4462-b7bc-45ec9dad33f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807631 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg6k7\" (UniqueName: \"kubernetes.io/projected/afe0138b-48f1-4462-b7bc-45ec9dad33f7-kube-api-access-hg6k7\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807663 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807675 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe0138b-48f1-4462-b7bc-45ec9dad33f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807684 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807694 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807702 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wws6r\" (UniqueName: \"kubernetes.io/projected/0be758f6-763d-498b-8145-717da7e9c490-kube-api-access-wws6r\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: I1215 14:19:26.807711 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be758f6-763d-498b-8145-717da7e9c490-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:26 crc kubenswrapper[4794]: 
I1215 14:19:26.807719 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be758f6-763d-498b-8145-717da7e9c490-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.215685 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bec18956-c880-4070-bf23-4456f4a38042","Type":"ContainerStarted","Data":"b7170e82f6c25554ee74cb6927f2449703a12a5d33d1fc961509c0443e5ee199"} Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.216766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4","Type":"ContainerStarted","Data":"854bd0713d25a566b59369ba56d09f7811adcfa055c19c2be8f3f1a6fffd40bf"} Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.218712 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"afe0138b-48f1-4462-b7bc-45ec9dad33f7","Type":"ContainerDied","Data":"e4b30d4d9e2d7cc12d63bf2e3e51ab5c50b92fb98ad4a6ba1d05eaf21bbfd78e"} Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.218749 4794 scope.go:117] "RemoveContainer" containerID="622872382b47ecff74f94f5cb0869c4821a4950281ebfc32164884faecc40e77" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.218835 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.225157 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0be758f6-763d-498b-8145-717da7e9c490","Type":"ContainerDied","Data":"33238bbb0d5886c73ebdfef4e6372055a7a75d1753bcc4c815f9fbefe05190e2"} Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.225194 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.246552 4794 scope.go:117] "RemoveContainer" containerID="d8923a289af47c85af2e9969c8221202acd0d70e06addbb4105c03ffa88206e0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.255896 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.283107 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.327797 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.335047 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.353056 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: E1215 14:19:27.353552 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" containerName="watcher-applier" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.353573 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" containerName="watcher-applier" Dec 15 14:19:27 crc kubenswrapper[4794]: E1215 14:19:27.353628 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be758f6-763d-498b-8145-717da7e9c490" containerName="watcher-decision-engine" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.353637 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be758f6-763d-498b-8145-717da7e9c490" containerName="watcher-decision-engine" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 
14:19:27.353860 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" containerName="watcher-applier" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.353881 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be758f6-763d-498b-8145-717da7e9c490" containerName="watcher-decision-engine" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.354646 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.356703 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.361357 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.369889 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.371163 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.373148 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.380486 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417155 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417209 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417243 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn54g\" (UniqueName: \"kubernetes.io/projected/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-kube-api-access-nn54g\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417280 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417335 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417400 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417415 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9z6m\" (UniqueName: \"kubernetes.io/projected/4bf1c253-9509-4a95-8840-c1ed039c0e9d-kube-api-access-t9z6m\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417530 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bf1c253-9509-4a95-8840-c1ed039c0e9d-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417687 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417864 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.417967 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519023 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519084 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 
14:19:27.519126 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519148 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519166 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn54g\" (UniqueName: \"kubernetes.io/projected/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-kube-api-access-nn54g\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519213 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 
15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519247 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9z6m\" (UniqueName: \"kubernetes.io/projected/4bf1c253-9509-4a95-8840-c1ed039c0e9d-kube-api-access-t9z6m\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519284 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bf1c253-9509-4a95-8840-c1ed039c0e9d-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519314 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.519711 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 
14:19:27.522365 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.524078 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.524226 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.524394 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bf1c253-9509-4a95-8840-c1ed039c0e9d-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.525655 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.527154 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.527819 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.527912 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.544416 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9z6m\" (UniqueName: \"kubernetes.io/projected/4bf1c253-9509-4a95-8840-c1ed039c0e9d-kube-api-access-t9z6m\") pod \"watcher-kuttl-applier-0\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.551088 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn54g\" (UniqueName: \"kubernetes.io/projected/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-kube-api-access-nn54g\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.687561 4794 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:27 crc kubenswrapper[4794]: I1215 14:19:27.700491 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.180098 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:19:28 crc kubenswrapper[4794]: W1215 14:19:28.183547 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff1d9d7_73eb_4d94_bcfa_6c8e72f85a0c.slice/crio-c51bf9b037d99d307161fbcd297db6cd4bafe6478811a72b7abc2f13de8f9d3d WatchSource:0}: Error finding container c51bf9b037d99d307161fbcd297db6cd4bafe6478811a72b7abc2f13de8f9d3d: Status 404 returned error can't find the container with id c51bf9b037d99d307161fbcd297db6cd4bafe6478811a72b7abc2f13de8f9d3d Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.220761 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.268111 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bec18956-c880-4070-bf23-4456f4a38042","Type":"ContainerStarted","Data":"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4"} Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.268163 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bec18956-c880-4070-bf23-4456f4a38042","Type":"ContainerStarted","Data":"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15"} Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.269206 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.278079 4794 generic.go:334] "Generic (PLEG): container finished" podID="b758c36f-801e-4cb6-a477-d6a81d9d04cd" containerID="0458b18c5616718b1599c32062ac677c77b74357e975dc80212cb5a039a5fb84" exitCode=0 Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.278174 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" event={"ID":"b758c36f-801e-4cb6-a477-d6a81d9d04cd","Type":"ContainerDied","Data":"0458b18c5616718b1599c32062ac677c77b74357e975dc80212cb5a039a5fb84"} Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.279518 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c","Type":"ContainerStarted","Data":"c51bf9b037d99d307161fbcd297db6cd4bafe6478811a72b7abc2f13de8f9d3d"} Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.280524 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4","Type":"ContainerStarted","Data":"c8f4ac3bf9aa768e6857a8c05b13396088d63f12384f1966104e00319a6fb4f8"} Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.281176 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.308288 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.308270015 podStartE2EDuration="3.308270015s" podCreationTimestamp="2025-12-15 14:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:28.297160721 +0000 UTC m=+1530.149183159" watchObservedRunningTime="2025-12-15 14:19:28.308270015 +0000 UTC m=+1530.160292453" Dec 15 14:19:28 crc 
kubenswrapper[4794]: I1215 14:19:28.361753 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=3.361731737 podStartE2EDuration="3.361731737s" podCreationTimestamp="2025-12-15 14:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:28.315840769 +0000 UTC m=+1530.167863207" watchObservedRunningTime="2025-12-15 14:19:28.361731737 +0000 UTC m=+1530.213754185" Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.746254 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be758f6-763d-498b-8145-717da7e9c490" path="/var/lib/kubelet/pods/0be758f6-763d-498b-8145-717da7e9c490/volumes" Dec 15 14:19:28 crc kubenswrapper[4794]: I1215 14:19:28.749716 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe0138b-48f1-4462-b7bc-45ec9dad33f7" path="/var/lib/kubelet/pods/afe0138b-48f1-4462-b7bc-45ec9dad33f7/volumes" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.294534 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"4bf1c253-9509-4a95-8840-c1ed039c0e9d","Type":"ContainerStarted","Data":"9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591"} Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.294647 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"4bf1c253-9509-4a95-8840-c1ed039c0e9d","Type":"ContainerStarted","Data":"3c41b80c177bfcaaf1f05ac936077232fc6f3b53167b5ea92be4e07b75282c6d"} Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.297256 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c","Type":"ContainerStarted","Data":"f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733"} Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.316530 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.316508309 podStartE2EDuration="2.316508309s" podCreationTimestamp="2025-12-15 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:29.314142232 +0000 UTC m=+1531.166164660" watchObservedRunningTime="2025-12-15 14:19:29.316508309 +0000 UTC m=+1531.168530757" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.345341 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.345323283 podStartE2EDuration="2.345323283s" podCreationTimestamp="2025-12-15 14:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:29.336262517 +0000 UTC m=+1531.188284955" watchObservedRunningTime="2025-12-15 14:19:29.345323283 +0000 UTC m=+1531.197345741" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.721782 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762073 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-cert-memcached-mtls\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762117 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-config-data\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762148 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-fernet-keys\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762173 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-credential-keys\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762224 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmghn\" (UniqueName: \"kubernetes.io/projected/b758c36f-801e-4cb6-a477-d6a81d9d04cd-kube-api-access-pmghn\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762282 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-combined-ca-bundle\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.762321 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-scripts\") pod \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\" (UID: \"b758c36f-801e-4cb6-a477-d6a81d9d04cd\") " Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.804171 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.806236 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b758c36f-801e-4cb6-a477-d6a81d9d04cd-kube-api-access-pmghn" (OuterVolumeSpecName: "kube-api-access-pmghn") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "kube-api-access-pmghn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.806300 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.806394 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-scripts" (OuterVolumeSpecName: "scripts") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.822639 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.863688 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.863727 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.863742 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmghn\" (UniqueName: \"kubernetes.io/projected/b758c36f-801e-4cb6-a477-d6a81d9d04cd-kube-api-access-pmghn\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.863753 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.863763 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.885955 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-config-data" (OuterVolumeSpecName: "config-data") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.915058 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "b758c36f-801e-4cb6-a477-d6a81d9d04cd" (UID: "b758c36f-801e-4cb6-a477-d6a81d9d04cd"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.966156 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:29 crc kubenswrapper[4794]: I1215 14:19:29.966461 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b758c36f-801e-4cb6-a477-d6a81d9d04cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:30 crc kubenswrapper[4794]: I1215 14:19:30.307449 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" event={"ID":"b758c36f-801e-4cb6-a477-d6a81d9d04cd","Type":"ContainerDied","Data":"67d6f18738c387ebcc6f59d3b5fa1283ed229a1140ce984bf2a0c63a0900c5e1"} Dec 15 14:19:30 crc kubenswrapper[4794]: I1215 14:19:30.308303 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d6f18738c387ebcc6f59d3b5fa1283ed229a1140ce984bf2a0c63a0900c5e1" Dec 15 14:19:30 crc kubenswrapper[4794]: I1215 14:19:30.307563 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 14:19:30 crc kubenswrapper[4794]: I1215 14:19:30.307543 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-hmftj" Dec 15 14:19:30 crc kubenswrapper[4794]: I1215 14:19:30.399217 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:30 crc kubenswrapper[4794]: I1215 14:19:30.627842 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:32 crc kubenswrapper[4794]: I1215 14:19:32.688827 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.628238 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.645317 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.646881 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.800947 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-757bfcdb76-mnt9m"] Dec 15 14:19:35 crc kubenswrapper[4794]: E1215 14:19:35.801359 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b758c36f-801e-4cb6-a477-d6a81d9d04cd" containerName="keystone-bootstrap" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.801381 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b758c36f-801e-4cb6-a477-d6a81d9d04cd" containerName="keystone-bootstrap" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.801609 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b758c36f-801e-4cb6-a477-d6a81d9d04cd" containerName="keystone-bootstrap" Dec 15 14:19:35 crc kubenswrapper[4794]: 
I1215 14:19:35.802353 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.810270 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-757bfcdb76-mnt9m"] Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873177 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6t4\" (UniqueName: \"kubernetes.io/projected/9ae848ae-88e2-430f-97e4-f3a46d1c178c-kube-api-access-fc6t4\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873266 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-fernet-keys\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873302 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-scripts\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873351 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-credential-keys\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc 
kubenswrapper[4794]: I1215 14:19:35.873406 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-config-data\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873445 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-cert-memcached-mtls\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-public-tls-certs\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873502 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-internal-tls-certs\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.873535 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-combined-ca-bundle\") pod \"keystone-757bfcdb76-mnt9m\" (UID: 
\"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-combined-ca-bundle\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6t4\" (UniqueName: \"kubernetes.io/projected/9ae848ae-88e2-430f-97e4-f3a46d1c178c-kube-api-access-fc6t4\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974848 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-fernet-keys\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974877 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-scripts\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974920 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-credential-keys\") pod \"keystone-757bfcdb76-mnt9m\" (UID: 
\"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974957 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-config-data\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.974982 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-cert-memcached-mtls\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.975000 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-public-tls-certs\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.975027 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-internal-tls-certs\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.981843 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-config-data\") pod \"keystone-757bfcdb76-mnt9m\" (UID: 
\"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.982792 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-credential-keys\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.983062 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-fernet-keys\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.983290 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-combined-ca-bundle\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.983968 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-scripts\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.984269 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-public-tls-certs\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " 
pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.996241 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-internal-tls-certs\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:35 crc kubenswrapper[4794]: I1215 14:19:35.997181 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ae848ae-88e2-430f-97e4-f3a46d1c178c-cert-memcached-mtls\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:36 crc kubenswrapper[4794]: I1215 14:19:36.004089 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6t4\" (UniqueName: \"kubernetes.io/projected/9ae848ae-88e2-430f-97e4-f3a46d1c178c-kube-api-access-fc6t4\") pod \"keystone-757bfcdb76-mnt9m\" (UID: \"9ae848ae-88e2-430f-97e4-f3a46d1c178c\") " pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:36 crc kubenswrapper[4794]: I1215 14:19:36.185507 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:36 crc kubenswrapper[4794]: I1215 14:19:36.388697 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:36 crc kubenswrapper[4794]: I1215 14:19:36.513733 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:36 crc kubenswrapper[4794]: I1215 14:19:36.657884 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-757bfcdb76-mnt9m"] Dec 15 14:19:36 crc kubenswrapper[4794]: W1215 14:19:36.664780 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae848ae_88e2_430f_97e4_f3a46d1c178c.slice/crio-ae83487444a6add170c6149ca7b0bc3633227ff9f2d805b2adfe61e163d2ebe6 WatchSource:0}: Error finding container ae83487444a6add170c6149ca7b0bc3633227ff9f2d805b2adfe61e163d2ebe6: Status 404 returned error can't find the container with id ae83487444a6add170c6149ca7b0bc3633227ff9f2d805b2adfe61e163d2ebe6 Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.413652 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" event={"ID":"9ae848ae-88e2-430f-97e4-f3a46d1c178c","Type":"ContainerStarted","Data":"62aadd44eb1ae7cfbac29acf35cdd8aa8cec9d1b2fb534a0dc0cbf34879ff9c7"} Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.413995 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" event={"ID":"9ae848ae-88e2-430f-97e4-f3a46d1c178c","Type":"ContainerStarted","Data":"ae83487444a6add170c6149ca7b0bc3633227ff9f2d805b2adfe61e163d2ebe6"} Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.414015 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:19:37 crc 
kubenswrapper[4794]: I1215 14:19:37.462059 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" podStartSLOduration=2.462039596 podStartE2EDuration="2.462039596s" podCreationTimestamp="2025-12-15 14:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:37.455951574 +0000 UTC m=+1539.307974012" watchObservedRunningTime="2025-12-15 14:19:37.462039596 +0000 UTC m=+1539.314062034" Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.688777 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.701051 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.722857 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:37 crc kubenswrapper[4794]: I1215 14:19:37.723262 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:38 crc kubenswrapper[4794]: I1215 14:19:38.422826 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:38 crc kubenswrapper[4794]: I1215 14:19:38.423090 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-kuttl-api-log" containerID="cri-o://4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15" gracePeriod=30 Dec 15 14:19:38 crc kubenswrapper[4794]: I1215 14:19:38.423254 4794 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-api" containerID="cri-o://bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4" gracePeriod=30 Dec 15 14:19:38 crc kubenswrapper[4794]: I1215 14:19:38.468576 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:19:38 crc kubenswrapper[4794]: I1215 14:19:38.475279 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.287239 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334415 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-config-data\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334462 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-combined-ca-bundle\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334497 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7ngh\" (UniqueName: \"kubernetes.io/projected/bec18956-c880-4070-bf23-4456f4a38042-kube-api-access-t7ngh\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334556 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-cert-memcached-mtls\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334640 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-public-tls-certs\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334698 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-custom-prometheus-ca\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334778 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bec18956-c880-4070-bf23-4456f4a38042-logs\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.334814 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-internal-tls-certs\") pod \"bec18956-c880-4070-bf23-4456f4a38042\" (UID: \"bec18956-c880-4070-bf23-4456f4a38042\") " Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.335915 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec18956-c880-4070-bf23-4456f4a38042-logs" (OuterVolumeSpecName: "logs") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: 
"bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.343772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec18956-c880-4070-bf23-4456f4a38042-kube-api-access-t7ngh" (OuterVolumeSpecName: "kube-api-access-t7ngh") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "kube-api-access-t7ngh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.384870 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.390789 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.413815 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.416621 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-config-data" (OuterVolumeSpecName: "config-data") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.437141 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7ngh\" (UniqueName: \"kubernetes.io/projected/bec18956-c880-4070-bf23-4456f4a38042-kube-api-access-t7ngh\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.437197 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.437211 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.437223 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bec18956-c880-4070-bf23-4456f4a38042-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.437236 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.437276 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439763 4794 generic.go:334] "Generic (PLEG): container finished" podID="bec18956-c880-4070-bf23-4456f4a38042" containerID="bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4" exitCode=0 Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439793 4794 generic.go:334] "Generic (PLEG): container finished" podID="bec18956-c880-4070-bf23-4456f4a38042" containerID="4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15" exitCode=143 Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439828 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439883 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bec18956-c880-4070-bf23-4456f4a38042","Type":"ContainerDied","Data":"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4"} Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439930 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bec18956-c880-4070-bf23-4456f4a38042","Type":"ContainerDied","Data":"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15"} Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439941 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"bec18956-c880-4070-bf23-4456f4a38042","Type":"ContainerDied","Data":"b7170e82f6c25554ee74cb6927f2449703a12a5d33d1fc961509c0443e5ee199"} Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.439958 4794 scope.go:117] "RemoveContainer" containerID="bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 
14:19:39.450430 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.462029 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "bec18956-c880-4070-bf23-4456f4a38042" (UID: "bec18956-c880-4070-bf23-4456f4a38042"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.469229 4794 scope.go:117] "RemoveContainer" containerID="4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.489890 4794 scope.go:117] "RemoveContainer" containerID="bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4" Dec 15 14:19:39 crc kubenswrapper[4794]: E1215 14:19:39.490259 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4\": container with ID starting with bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4 not found: ID does not exist" containerID="bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.490297 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4"} err="failed to get container status 
\"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4\": rpc error: code = NotFound desc = could not find container \"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4\": container with ID starting with bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4 not found: ID does not exist" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.490322 4794 scope.go:117] "RemoveContainer" containerID="4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15" Dec 15 14:19:39 crc kubenswrapper[4794]: E1215 14:19:39.490813 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15\": container with ID starting with 4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15 not found: ID does not exist" containerID="4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.490844 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15"} err="failed to get container status \"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15\": rpc error: code = NotFound desc = could not find container \"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15\": container with ID starting with 4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15 not found: ID does not exist" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.490862 4794 scope.go:117] "RemoveContainer" containerID="bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.491061 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4"} err="failed to get 
container status \"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4\": rpc error: code = NotFound desc = could not find container \"bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4\": container with ID starting with bdac3479fc95b0e39b2f0c9d9d332b916b634867ac441f9e28b4947e79f832b4 not found: ID does not exist" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.491077 4794 scope.go:117] "RemoveContainer" containerID="4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.491298 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15"} err="failed to get container status \"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15\": rpc error: code = NotFound desc = could not find container \"4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15\": container with ID starting with 4d56b946d2b5c2a0b3ba5207a7c63c9dfe6a7647311b002ab81c8899c564fe15 not found: ID does not exist" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.540739 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.540774 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bec18956-c880-4070-bf23-4456f4a38042-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.769335 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.778001 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] 
Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.798798 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:39 crc kubenswrapper[4794]: E1215 14:19:39.799241 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-kuttl-api-log" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.799262 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-kuttl-api-log" Dec 15 14:19:39 crc kubenswrapper[4794]: E1215 14:19:39.799292 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-api" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.799300 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-api" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.799507 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-kuttl-api-log" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.799542 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec18956-c880-4070-bf23-4456f4a38042" containerName="watcher-api" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.802314 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.816620 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.834277 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.844779 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.844819 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.844887 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.844970 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f73c00-c903-4f92-b59f-55d280fd48ac-logs\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.845062 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.845216 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srn6h\" (UniqueName: \"kubernetes.io/projected/06f73c00-c903-4f92-b59f-55d280fd48ac-kube-api-access-srn6h\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.946357 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.946398 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.946438 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.946491 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f73c00-c903-4f92-b59f-55d280fd48ac-logs\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.946543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.946624 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srn6h\" (UniqueName: \"kubernetes.io/projected/06f73c00-c903-4f92-b59f-55d280fd48ac-kube-api-access-srn6h\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.947322 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f73c00-c903-4f92-b59f-55d280fd48ac-logs\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.950979 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.951611 
4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.952859 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.953293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:39 crc kubenswrapper[4794]: I1215 14:19:39.971889 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srn6h\" (UniqueName: \"kubernetes.io/projected/06f73c00-c903-4f92-b59f-55d280fd48ac-kube-api-access-srn6h\") pod \"watcher-kuttl-api-0\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:40 crc kubenswrapper[4794]: I1215 14:19:40.166783 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:40 crc kubenswrapper[4794]: I1215 14:19:40.612340 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:19:40 crc kubenswrapper[4794]: I1215 14:19:40.747325 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec18956-c880-4070-bf23-4456f4a38042" path="/var/lib/kubelet/pods/bec18956-c880-4070-bf23-4456f4a38042/volumes" Dec 15 14:19:41 crc kubenswrapper[4794]: I1215 14:19:41.461171 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"06f73c00-c903-4f92-b59f-55d280fd48ac","Type":"ContainerStarted","Data":"28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545"} Dec 15 14:19:41 crc kubenswrapper[4794]: I1215 14:19:41.461443 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:41 crc kubenswrapper[4794]: I1215 14:19:41.461455 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"06f73c00-c903-4f92-b59f-55d280fd48ac","Type":"ContainerStarted","Data":"0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f"} Dec 15 14:19:41 crc kubenswrapper[4794]: I1215 14:19:41.461465 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"06f73c00-c903-4f92-b59f-55d280fd48ac","Type":"ContainerStarted","Data":"3ecd57e6dd460cf6ce1f1380ada7bbd5964aa155ec1c4f8837181be5118e5645"} Dec 15 14:19:41 crc kubenswrapper[4794]: I1215 14:19:41.486380 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.48635699 podStartE2EDuration="2.48635699s" podCreationTimestamp="2025-12-15 14:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:19:41.48209708 +0000 UTC m=+1543.334119528" watchObservedRunningTime="2025-12-15 14:19:41.48635699 +0000 UTC m=+1543.338379438" Dec 15 14:19:43 crc kubenswrapper[4794]: I1215 14:19:43.410494 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:19:43 crc kubenswrapper[4794]: I1215 14:19:43.738882 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:45 crc kubenswrapper[4794]: I1215 14:19:45.167933 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:50 crc kubenswrapper[4794]: I1215 14:19:50.167870 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:50 crc kubenswrapper[4794]: I1215 14:19:50.174810 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:50 crc kubenswrapper[4794]: I1215 14:19:50.544534 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:19:54 crc kubenswrapper[4794]: I1215 14:19:54.534240 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:19:54 crc kubenswrapper[4794]: I1215 14:19:54.534929 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:19:54 crc kubenswrapper[4794]: I1215 14:19:54.534997 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:19:54 crc kubenswrapper[4794]: I1215 14:19:54.535999 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:19:54 crc kubenswrapper[4794]: I1215 14:19:54.536102 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" gracePeriod=600 Dec 15 14:19:56 crc kubenswrapper[4794]: I1215 14:19:56.586221 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" exitCode=0 Dec 15 14:19:56 crc kubenswrapper[4794]: I1215 14:19:56.586274 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098"} Dec 15 14:19:56 crc kubenswrapper[4794]: I1215 14:19:56.586505 4794 scope.go:117] "RemoveContainer" containerID="29f0e3d8aa7777138c2e7242f8fdf176d9b72c162c2a35610e7f08f0da28f0c2" Dec 15 14:19:57 crc kubenswrapper[4794]: E1215 14:19:57.349569 4794 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:19:57 crc kubenswrapper[4794]: I1215 14:19:57.600842 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:19:57 crc kubenswrapper[4794]: E1215 14:19:57.602238 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.274927 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxtxn"] Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.278713 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.294124 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxtxn"] Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.348049 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bftpx\" (UniqueName: \"kubernetes.io/projected/a174b44c-de06-4371-8d2e-bacfc77c7d94-kube-api-access-bftpx\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.348124 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-catalog-content\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.348172 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-utilities\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.449244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bftpx\" (UniqueName: \"kubernetes.io/projected/a174b44c-de06-4371-8d2e-bacfc77c7d94-kube-api-access-bftpx\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.449290 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-catalog-content\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.449332 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-utilities\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.449875 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-utilities\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.449910 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-catalog-content\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.472291 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bftpx\" (UniqueName: \"kubernetes.io/projected/a174b44c-de06-4371-8d2e-bacfc77c7d94-kube-api-access-bftpx\") pod \"redhat-marketplace-fxtxn\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:04 crc kubenswrapper[4794]: I1215 14:20:04.600181 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:05 crc kubenswrapper[4794]: I1215 14:20:05.127181 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxtxn"] Dec 15 14:20:05 crc kubenswrapper[4794]: W1215 14:20:05.131152 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda174b44c_de06_4371_8d2e_bacfc77c7d94.slice/crio-b29b37decbf00810b6fc7799b1f92b0501d8e34a7e856fd2f882db687378517e WatchSource:0}: Error finding container b29b37decbf00810b6fc7799b1f92b0501d8e34a7e856fd2f882db687378517e: Status 404 returned error can't find the container with id b29b37decbf00810b6fc7799b1f92b0501d8e34a7e856fd2f882db687378517e Dec 15 14:20:05 crc kubenswrapper[4794]: I1215 14:20:05.674017 4794 generic.go:334] "Generic (PLEG): container finished" podID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerID="692144112a504bafbac8896aa3d333bc907da2304ab13df5e87d83751c6f2d45" exitCode=0 Dec 15 14:20:05 crc kubenswrapper[4794]: I1215 14:20:05.674078 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxtxn" event={"ID":"a174b44c-de06-4371-8d2e-bacfc77c7d94","Type":"ContainerDied","Data":"692144112a504bafbac8896aa3d333bc907da2304ab13df5e87d83751c6f2d45"} Dec 15 14:20:05 crc kubenswrapper[4794]: I1215 14:20:05.674426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxtxn" event={"ID":"a174b44c-de06-4371-8d2e-bacfc77c7d94","Type":"ContainerStarted","Data":"b29b37decbf00810b6fc7799b1f92b0501d8e34a7e856fd2f882db687378517e"} Dec 15 14:20:07 crc kubenswrapper[4794]: I1215 14:20:07.696111 4794 generic.go:334] "Generic (PLEG): container finished" podID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerID="5339ce79d1f301f75f30dd0d466aea23afdd6cdb9ad71dbdd3826ab01547e0c3" exitCode=0 Dec 15 14:20:07 crc kubenswrapper[4794]: I1215 
14:20:07.696174 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxtxn" event={"ID":"a174b44c-de06-4371-8d2e-bacfc77c7d94","Type":"ContainerDied","Data":"5339ce79d1f301f75f30dd0d466aea23afdd6cdb9ad71dbdd3826ab01547e0c3"} Dec 15 14:20:07 crc kubenswrapper[4794]: I1215 14:20:07.946652 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-757bfcdb76-mnt9m" Dec 15 14:20:08 crc kubenswrapper[4794]: I1215 14:20:08.006243 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t"] Dec 15 14:20:08 crc kubenswrapper[4794]: I1215 14:20:08.006487 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" podUID="05371ef4-81ff-4685-a82b-98d71c8bf9cb" containerName="keystone-api" containerID="cri-o://350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea" gracePeriod=30 Dec 15 14:20:08 crc kubenswrapper[4794]: I1215 14:20:08.710035 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxtxn" event={"ID":"a174b44c-de06-4371-8d2e-bacfc77c7d94","Type":"ContainerStarted","Data":"84734b81887f8adc7c424add783ecaa4f48370b02e6bcc63e03e0cd577a712e0"} Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.675561 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.693900 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fxtxn" podStartSLOduration=5.126967324 podStartE2EDuration="7.693875425s" podCreationTimestamp="2025-12-15 14:20:04 +0000 UTC" firstStartedPulling="2025-12-15 14:20:05.675797574 +0000 UTC m=+1567.527820052" lastFinishedPulling="2025-12-15 14:20:08.242705715 +0000 UTC m=+1570.094728153" observedRunningTime="2025-12-15 14:20:08.730593898 +0000 UTC m=+1570.582616326" watchObservedRunningTime="2025-12-15 14:20:11.693875425 +0000 UTC m=+1573.545897863" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.729972 4794 generic.go:334] "Generic (PLEG): container finished" podID="05371ef4-81ff-4685-a82b-98d71c8bf9cb" containerID="350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea" exitCode=0 Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.730014 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" event={"ID":"05371ef4-81ff-4685-a82b-98d71c8bf9cb","Type":"ContainerDied","Data":"350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea"} Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.730040 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" event={"ID":"05371ef4-81ff-4685-a82b-98d71c8bf9cb","Type":"ContainerDied","Data":"8bdde47970f64c73f4ef2daf6a0567d5352a2bd8c2b7e958ff73191cce280465"} Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.730055 4794 scope.go:117] "RemoveContainer" containerID="350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.730162 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.736745 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:20:11 crc kubenswrapper[4794]: E1215 14:20:11.737056 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.760109 4794 scope.go:117] "RemoveContainer" containerID="350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea" Dec 15 14:20:11 crc kubenswrapper[4794]: E1215 14:20:11.760451 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea\": container with ID starting with 350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea not found: ID does not exist" containerID="350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.760487 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea"} err="failed to get container status \"350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea\": rpc error: code = NotFound desc = could not find container \"350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea\": container with ID starting with 350a69c187f92585b0995f9fc27cdbd8696c534e2795c4d72dcb7e1ae23d01ea not found: ID does not exist" Dec 15 
14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875150 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-combined-ca-bundle\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875227 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-scripts\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-public-tls-certs\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875265 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-fernet-keys\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875306 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-credential-keys\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875332 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-config-data\") pod 
\"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875370 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-internal-tls-certs\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.875427 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnx4\" (UniqueName: \"kubernetes.io/projected/05371ef4-81ff-4685-a82b-98d71c8bf9cb-kube-api-access-nrnx4\") pod \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\" (UID: \"05371ef4-81ff-4685-a82b-98d71c8bf9cb\") " Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.880388 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.880575 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.880614 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05371ef4-81ff-4685-a82b-98d71c8bf9cb-kube-api-access-nrnx4" (OuterVolumeSpecName: "kube-api-access-nrnx4") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "kube-api-access-nrnx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.883870 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-scripts" (OuterVolumeSpecName: "scripts") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.910468 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.920962 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-config-data" (OuterVolumeSpecName: "config-data") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.922219 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.932309 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "05371ef4-81ff-4685-a82b-98d71c8bf9cb" (UID: "05371ef4-81ff-4685-a82b-98d71c8bf9cb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977815 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977850 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977863 4794 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977876 4794 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 15 
14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977886 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977897 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977907 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnx4\" (UniqueName: \"kubernetes.io/projected/05371ef4-81ff-4685-a82b-98d71c8bf9cb-kube-api-access-nrnx4\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:11 crc kubenswrapper[4794]: I1215 14:20:11.977918 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05371ef4-81ff-4685-a82b-98d71c8bf9cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:12 crc kubenswrapper[4794]: I1215 14:20:12.060186 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t"] Dec 15 14:20:12 crc kubenswrapper[4794]: I1215 14:20:12.067991 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-7dcdc8d5f6-mfh4t"] Dec 15 14:20:12 crc kubenswrapper[4794]: I1215 14:20:12.753172 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05371ef4-81ff-4685-a82b-98d71c8bf9cb" path="/var/lib/kubelet/pods/05371ef4-81ff-4685-a82b-98d71c8bf9cb/volumes" Dec 15 14:20:13 crc kubenswrapper[4794]: I1215 14:20:13.856134 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:13 crc kubenswrapper[4794]: I1215 14:20:13.856400 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-central-agent" containerID="cri-o://7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc" gracePeriod=30 Dec 15 14:20:13 crc kubenswrapper[4794]: I1215 14:20:13.856439 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="proxy-httpd" containerID="cri-o://ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da" gracePeriod=30 Dec 15 14:20:13 crc kubenswrapper[4794]: I1215 14:20:13.856452 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="sg-core" containerID="cri-o://cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5" gracePeriod=30 Dec 15 14:20:13 crc kubenswrapper[4794]: I1215 14:20:13.856483 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-notification-agent" containerID="cri-o://aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af" gracePeriod=30 Dec 15 14:20:14 crc kubenswrapper[4794]: E1215 14:20:14.345537 4794 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9bf5a4e_6555_45ef_ba95_e2e5d127d4f7.slice/crio-conmon-7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9bf5a4e_6555_45ef_ba95_e2e5d127d4f7.slice/crio-7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc.scope\": RecentStats: unable to find data in memory cache]" Dec 15 14:20:14 crc 
kubenswrapper[4794]: I1215 14:20:14.601323 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.601372 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.651190 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.753884 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerID="ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da" exitCode=0 Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.753918 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerID="cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5" exitCode=2 Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.753926 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerID="7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc" exitCode=0 Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.753920 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerDied","Data":"ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da"} Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.753961 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerDied","Data":"cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5"} Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.753973 4794 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerDied","Data":"7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc"} Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.794636 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.907041 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rgc6m"] Dec 15 14:20:14 crc kubenswrapper[4794]: E1215 14:20:14.907476 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05371ef4-81ff-4685-a82b-98d71c8bf9cb" containerName="keystone-api" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.907491 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="05371ef4-81ff-4685-a82b-98d71c8bf9cb" containerName="keystone-api" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.907732 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="05371ef4-81ff-4685-a82b-98d71c8bf9cb" containerName="keystone-api" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.909228 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:14 crc kubenswrapper[4794]: I1215 14:20:14.915781 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgc6m"] Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.025594 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkkz\" (UniqueName: \"kubernetes.io/projected/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-kube-api-access-hxkkz\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.025954 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-catalog-content\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.026086 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-utilities\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.128004 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkkz\" (UniqueName: \"kubernetes.io/projected/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-kube-api-access-hxkkz\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.128074 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-catalog-content\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.128209 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-utilities\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.128565 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-catalog-content\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.128667 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-utilities\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.157840 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkkz\" (UniqueName: \"kubernetes.io/projected/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-kube-api-access-hxkkz\") pod \"community-operators-rgc6m\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.231788 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.636397 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgc6m"] Dec 15 14:20:15 crc kubenswrapper[4794]: I1215 14:20:15.761768 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerStarted","Data":"b8dccecb594ccc361d8a1c5d5c7cb3865704916d7f463f6d0627efc7b8396acd"} Dec 15 14:20:16 crc kubenswrapper[4794]: I1215 14:20:16.770734 4794 generic.go:334] "Generic (PLEG): container finished" podID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerID="f31a37b73155fe37489c024cf210083d297c8cfaf4b0f2d1c09bfdad1b9726af" exitCode=0 Dec 15 14:20:16 crc kubenswrapper[4794]: I1215 14:20:16.770832 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerDied","Data":"f31a37b73155fe37489c024cf210083d297c8cfaf4b0f2d1c09bfdad1b9726af"} Dec 15 14:20:17 crc kubenswrapper[4794]: I1215 14:20:17.780674 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerStarted","Data":"111819f93b991faf1711079b6490c48988c3f7a2bdc979a0a582e4cb59870a99"} Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.318266 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.485927 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-combined-ca-bundle\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486037 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-run-httpd\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486154 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-sg-core-conf-yaml\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486233 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-ceilometer-tls-certs\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486302 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-config-data\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486346 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-knggn\" (UniqueName: \"kubernetes.io/projected/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-kube-api-access-knggn\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486386 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-log-httpd\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.486438 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-scripts\") pod \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\" (UID: \"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7\") " Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.489059 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.489246 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.495662 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-kube-api-access-knggn" (OuterVolumeSpecName: "kube-api-access-knggn") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "kube-api-access-knggn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.495743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-scripts" (OuterVolumeSpecName: "scripts") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.544297 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.584000 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.588847 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.588889 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.588909 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.588928 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knggn\" (UniqueName: \"kubernetes.io/projected/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-kube-api-access-knggn\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.588943 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.588957 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.596090 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: 
"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.608642 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-config-data" (OuterVolumeSpecName: "config-data") pod "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" (UID: "f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.692357 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.692392 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.791997 4794 generic.go:334] "Generic (PLEG): container finished" podID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerID="aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af" exitCode=0 Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.792089 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.792104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerDied","Data":"aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af"} Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.792183 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7","Type":"ContainerDied","Data":"388b771aa4726d0c4082501d32b669418f7da05f77069b5e485eeff76e8f253e"} Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.792204 4794 scope.go:117] "RemoveContainer" containerID="ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.797441 4794 generic.go:334] "Generic (PLEG): container finished" podID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerID="111819f93b991faf1711079b6490c48988c3f7a2bdc979a0a582e4cb59870a99" exitCode=0 Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.797485 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerDied","Data":"111819f93b991faf1711079b6490c48988c3f7a2bdc979a0a582e4cb59870a99"} Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.820764 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.825651 4794 scope.go:117] "RemoveContainer" containerID="cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.842762 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.858756 
4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.859311 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="proxy-httpd" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859324 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="proxy-httpd" Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.859360 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-notification-agent" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859367 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-notification-agent" Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.859385 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="sg-core" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859390 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="sg-core" Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.859404 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-central-agent" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859411 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-central-agent" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859755 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-notification-agent" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859772 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="proxy-httpd" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859797 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="sg-core" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.859808 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" containerName="ceilometer-central-agent" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.864355 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.868623 4794 scope.go:117] "RemoveContainer" containerID="aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.873202 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.873430 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.873492 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.902896 4794 scope.go:117] "RemoveContainer" containerID="7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.908612 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.918426 4794 scope.go:117] "RemoveContainer" containerID="ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da" Dec 15 14:20:18 crc kubenswrapper[4794]: 
E1215 14:20:18.918821 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da\": container with ID starting with ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da not found: ID does not exist" containerID="ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.918845 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da"} err="failed to get container status \"ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da\": rpc error: code = NotFound desc = could not find container \"ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da\": container with ID starting with ac2cd67fb5d7aebc1212cd910539dcca8d814d1543c5bc06f4607bae81b179da not found: ID does not exist" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.918864 4794 scope.go:117] "RemoveContainer" containerID="cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5" Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.919116 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5\": container with ID starting with cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5 not found: ID does not exist" containerID="cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.919135 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5"} err="failed to get container status \"cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5\": 
rpc error: code = NotFound desc = could not find container \"cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5\": container with ID starting with cfa1d2e5889b7cee91f2a7c985d547d3b42d0fa2909adb6076a83e9f2d4d69a5 not found: ID does not exist" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.919149 4794 scope.go:117] "RemoveContainer" containerID="aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af" Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.919354 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af\": container with ID starting with aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af not found: ID does not exist" containerID="aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.919372 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af"} err="failed to get container status \"aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af\": rpc error: code = NotFound desc = could not find container \"aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af\": container with ID starting with aff368baf96453e13d29414ab8cb8b30fa579bad7975acb97b3f236d609726af not found: ID does not exist" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.919382 4794 scope.go:117] "RemoveContainer" containerID="7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc" Dec 15 14:20:18 crc kubenswrapper[4794]: E1215 14:20:18.919553 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc\": container with ID starting with 
7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc not found: ID does not exist" containerID="7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.919568 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc"} err="failed to get container status \"7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc\": rpc error: code = NotFound desc = could not find container \"7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc\": container with ID starting with 7c3250c54b76be211aecbd71a75cf893a476b2e4063a24cce6575bc810d8a4fc not found: ID does not exist" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999374 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-config-data\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999424 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999452 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-scripts\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999593 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-log-httpd\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999637 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999655 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-run-httpd\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999719 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:18 crc kubenswrapper[4794]: I1215 14:20:18.999738 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnxj\" (UniqueName: \"kubernetes.io/projected/cffdf4db-d859-4c87-a580-1a2b595d6017-kube-api-access-zdnxj\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.100809 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-log-httpd\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.100905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.100983 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-run-httpd\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.101459 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-run-httpd\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.101459 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-log-httpd\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.101862 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.101900 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnxj\" (UniqueName: \"kubernetes.io/projected/cffdf4db-d859-4c87-a580-1a2b595d6017-kube-api-access-zdnxj\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.101932 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-config-data\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.101973 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.102004 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-scripts\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.106184 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.106342 4794 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.106518 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-config-data\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.118218 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnxj\" (UniqueName: \"kubernetes.io/projected/cffdf4db-d859-4c87-a580-1a2b595d6017-kube-api-access-zdnxj\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.118396 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.125313 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-scripts\") pod \"ceilometer-0\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.197996 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.460141 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxtxn"] Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.460852 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fxtxn" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="registry-server" containerID="cri-o://84734b81887f8adc7c424add783ecaa4f48370b02e6bcc63e03e0cd577a712e0" gracePeriod=2 Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.672849 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.811396 4794 generic.go:334] "Generic (PLEG): container finished" podID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerID="84734b81887f8adc7c424add783ecaa4f48370b02e6bcc63e03e0cd577a712e0" exitCode=0 Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.811470 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxtxn" event={"ID":"a174b44c-de06-4371-8d2e-bacfc77c7d94","Type":"ContainerDied","Data":"84734b81887f8adc7c424add783ecaa4f48370b02e6bcc63e03e0cd577a712e0"} Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.815518 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerStarted","Data":"2e63b5f37c5c86c106d87a981f2ef7cf8222a7ab568e33b22b85e74d0ce83f86"} Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.817216 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerStarted","Data":"3d3947c48a0f18a9bf11ffec26aa91f12c2a17a695f45a1b093dfe6415b398b6"} Dec 15 14:20:19 crc 
kubenswrapper[4794]: I1215 14:20:19.840623 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rgc6m" podStartSLOduration=3.396057243 podStartE2EDuration="5.840605045s" podCreationTimestamp="2025-12-15 14:20:14 +0000 UTC" firstStartedPulling="2025-12-15 14:20:16.772774293 +0000 UTC m=+1578.624796731" lastFinishedPulling="2025-12-15 14:20:19.217322095 +0000 UTC m=+1581.069344533" observedRunningTime="2025-12-15 14:20:19.832738473 +0000 UTC m=+1581.684760931" watchObservedRunningTime="2025-12-15 14:20:19.840605045 +0000 UTC m=+1581.692627483" Dec 15 14:20:19 crc kubenswrapper[4794]: I1215 14:20:19.924068 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.013310 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-utilities\") pod \"a174b44c-de06-4371-8d2e-bacfc77c7d94\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.013384 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-catalog-content\") pod \"a174b44c-de06-4371-8d2e-bacfc77c7d94\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.013438 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bftpx\" (UniqueName: \"kubernetes.io/projected/a174b44c-de06-4371-8d2e-bacfc77c7d94-kube-api-access-bftpx\") pod \"a174b44c-de06-4371-8d2e-bacfc77c7d94\" (UID: \"a174b44c-de06-4371-8d2e-bacfc77c7d94\") " Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.014128 4794 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-utilities" (OuterVolumeSpecName: "utilities") pod "a174b44c-de06-4371-8d2e-bacfc77c7d94" (UID: "a174b44c-de06-4371-8d2e-bacfc77c7d94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.018996 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a174b44c-de06-4371-8d2e-bacfc77c7d94-kube-api-access-bftpx" (OuterVolumeSpecName: "kube-api-access-bftpx") pod "a174b44c-de06-4371-8d2e-bacfc77c7d94" (UID: "a174b44c-de06-4371-8d2e-bacfc77c7d94"). InnerVolumeSpecName "kube-api-access-bftpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.033252 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a174b44c-de06-4371-8d2e-bacfc77c7d94" (UID: "a174b44c-de06-4371-8d2e-bacfc77c7d94"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.115501 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.115534 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a174b44c-de06-4371-8d2e-bacfc77c7d94-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.115546 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bftpx\" (UniqueName: \"kubernetes.io/projected/a174b44c-de06-4371-8d2e-bacfc77c7d94-kube-api-access-bftpx\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.746807 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7" path="/var/lib/kubelet/pods/f9bf5a4e-6555-45ef-ba95-e2e5d127d4f7/volumes" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.829634 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxtxn" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.829641 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxtxn" event={"ID":"a174b44c-de06-4371-8d2e-bacfc77c7d94","Type":"ContainerDied","Data":"b29b37decbf00810b6fc7799b1f92b0501d8e34a7e856fd2f882db687378517e"} Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.829769 4794 scope.go:117] "RemoveContainer" containerID="84734b81887f8adc7c424add783ecaa4f48370b02e6bcc63e03e0cd577a712e0" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.831720 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerStarted","Data":"1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f"} Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.849835 4794 scope.go:117] "RemoveContainer" containerID="5339ce79d1f301f75f30dd0d466aea23afdd6cdb9ad71dbdd3826ab01547e0c3" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.853300 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxtxn"] Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.869895 4794 scope.go:117] "RemoveContainer" containerID="692144112a504bafbac8896aa3d333bc907da2304ab13df5e87d83751c6f2d45" Dec 15 14:20:20 crc kubenswrapper[4794]: I1215 14:20:20.884816 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxtxn"] Dec 15 14:20:21 crc kubenswrapper[4794]: I1215 14:20:21.842090 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerStarted","Data":"53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2"} Dec 15 14:20:22 crc kubenswrapper[4794]: I1215 14:20:22.747805 4794 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" path="/var/lib/kubelet/pods/a174b44c-de06-4371-8d2e-bacfc77c7d94/volumes" Dec 15 14:20:22 crc kubenswrapper[4794]: I1215 14:20:22.853388 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerStarted","Data":"4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1"} Dec 15 14:20:24 crc kubenswrapper[4794]: I1215 14:20:24.885006 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerStarted","Data":"779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905"} Dec 15 14:20:24 crc kubenswrapper[4794]: I1215 14:20:24.885378 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:24 crc kubenswrapper[4794]: I1215 14:20:24.918065 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.965973832 podStartE2EDuration="6.918039943s" podCreationTimestamp="2025-12-15 14:20:18 +0000 UTC" firstStartedPulling="2025-12-15 14:20:19.672787011 +0000 UTC m=+1581.524809449" lastFinishedPulling="2025-12-15 14:20:23.624853122 +0000 UTC m=+1585.476875560" observedRunningTime="2025-12-15 14:20:24.909302656 +0000 UTC m=+1586.761325114" watchObservedRunningTime="2025-12-15 14:20:24.918039943 +0000 UTC m=+1586.770062421" Dec 15 14:20:25 crc kubenswrapper[4794]: I1215 14:20:25.232782 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:25 crc kubenswrapper[4794]: I1215 14:20:25.233222 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:25 crc kubenswrapper[4794]: 
I1215 14:20:25.276403 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:25 crc kubenswrapper[4794]: I1215 14:20:25.738099 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:20:25 crc kubenswrapper[4794]: E1215 14:20:25.738994 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:20:25 crc kubenswrapper[4794]: I1215 14:20:25.947473 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:27 crc kubenswrapper[4794]: I1215 14:20:27.659092 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgc6m"] Dec 15 14:20:28 crc kubenswrapper[4794]: I1215 14:20:28.923624 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rgc6m" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="registry-server" containerID="cri-o://2e63b5f37c5c86c106d87a981f2ef7cf8222a7ab568e33b22b85e74d0ce83f86" gracePeriod=2 Dec 15 14:20:29 crc kubenswrapper[4794]: I1215 14:20:29.934203 4794 generic.go:334] "Generic (PLEG): container finished" podID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerID="2e63b5f37c5c86c106d87a981f2ef7cf8222a7ab568e33b22b85e74d0ce83f86" exitCode=0 Dec 15 14:20:29 crc kubenswrapper[4794]: I1215 14:20:29.934280 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" 
event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerDied","Data":"2e63b5f37c5c86c106d87a981f2ef7cf8222a7ab568e33b22b85e74d0ce83f86"} Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.509743 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.603111 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxkkz\" (UniqueName: \"kubernetes.io/projected/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-kube-api-access-hxkkz\") pod \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.603283 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-utilities\") pod \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.603311 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-catalog-content\") pod \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\" (UID: \"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0\") " Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.604448 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-utilities" (OuterVolumeSpecName: "utilities") pod "6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" (UID: "6b2494ed-9b4a-4134-baa2-2a0aca6bdef0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.608702 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-kube-api-access-hxkkz" (OuterVolumeSpecName: "kube-api-access-hxkkz") pod "6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" (UID: "6b2494ed-9b4a-4134-baa2-2a0aca6bdef0"). InnerVolumeSpecName "kube-api-access-hxkkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.659545 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" (UID: "6b2494ed-9b4a-4134-baa2-2a0aca6bdef0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.705111 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxkkz\" (UniqueName: \"kubernetes.io/projected/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-kube-api-access-hxkkz\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.705146 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.705155 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.952726 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgc6m" 
event={"ID":"6b2494ed-9b4a-4134-baa2-2a0aca6bdef0","Type":"ContainerDied","Data":"b8dccecb594ccc361d8a1c5d5c7cb3865704916d7f463f6d0627efc7b8396acd"} Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.953884 4794 scope.go:117] "RemoveContainer" containerID="2e63b5f37c5c86c106d87a981f2ef7cf8222a7ab568e33b22b85e74d0ce83f86" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.953993 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgc6m" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.981485 4794 scope.go:117] "RemoveContainer" containerID="111819f93b991faf1711079b6490c48988c3f7a2bdc979a0a582e4cb59870a99" Dec 15 14:20:30 crc kubenswrapper[4794]: I1215 14:20:30.982363 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgc6m"] Dec 15 14:20:31 crc kubenswrapper[4794]: I1215 14:20:31.003855 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rgc6m"] Dec 15 14:20:31 crc kubenswrapper[4794]: I1215 14:20:31.006527 4794 scope.go:117] "RemoveContainer" containerID="f31a37b73155fe37489c024cf210083d297c8cfaf4b0f2d1c09bfdad1b9726af" Dec 15 14:20:32 crc kubenswrapper[4794]: I1215 14:20:32.747665 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" path="/var/lib/kubelet/pods/6b2494ed-9b4a-4134-baa2-2a0aca6bdef0/volumes" Dec 15 14:20:39 crc kubenswrapper[4794]: I1215 14:20:39.737493 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:20:39 crc kubenswrapper[4794]: E1215 14:20:39.738568 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.207863 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.609208 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.622377 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lp9wm"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.728480 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.728709 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" containerName="watcher-decision-engine" containerID="cri-o://f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733" gracePeriod=30 Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.744097 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.744320 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-kuttl-api-log" containerID="cri-o://0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f" gracePeriod=30 Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.744451 4794 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-api" containerID="cri-o://28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545" gracePeriod=30 Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.758985 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchercad7-account-delete-jkk9k"] Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 14:20:49.759352 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="extract-utilities" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759367 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="extract-utilities" Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 14:20:49.759396 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="extract-content" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759405 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="extract-content" Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 14:20:49.759413 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="extract-utilities" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759419 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="extract-utilities" Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 14:20:49.759428 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="registry-server" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759434 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="registry-server" 
Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 14:20:49.759447 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="extract-content" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759453 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="extract-content" Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 14:20:49.759465 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="registry-server" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759471 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="registry-server" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759645 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2494ed-9b4a-4134-baa2-2a0aca6bdef0" containerName="registry-server" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.759660 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a174b44c-de06-4371-8d2e-bacfc77c7d94" containerName="registry-server" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.760181 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.782054 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchercad7-account-delete-jkk9k"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.805697 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.809733 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" containerName="watcher-applier" containerID="cri-o://9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591" gracePeriod=30 Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.818480 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-jxgjt"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.852104 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-jxgjt"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.909378 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-cad7-account-create-rcmsr"] Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.944007 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vfql\" (UniqueName: \"kubernetes.io/projected/06ff9c9b-7834-4543-9919-3c20a84ecfa5-kube-api-access-7vfql\") pod \"watchercad7-account-delete-jkk9k\" (UID: \"06ff9c9b-7834-4543-9919-3c20a84ecfa5\") " pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.946094 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchercad7-account-delete-jkk9k"] Dec 15 14:20:49 crc kubenswrapper[4794]: E1215 
14:20:49.947127 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7vfql], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" podUID="06ff9c9b-7834-4543-9919-3c20a84ecfa5" Dec 15 14:20:49 crc kubenswrapper[4794]: I1215 14:20:49.957769 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-cad7-account-create-rcmsr"] Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.047478 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vfql\" (UniqueName: \"kubernetes.io/projected/06ff9c9b-7834-4543-9919-3c20a84ecfa5-kube-api-access-7vfql\") pod \"watchercad7-account-delete-jkk9k\" (UID: \"06ff9c9b-7834-4543-9919-3c20a84ecfa5\") " pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.076479 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vfql\" (UniqueName: \"kubernetes.io/projected/06ff9c9b-7834-4543-9919-3c20a84ecfa5-kube-api-access-7vfql\") pod \"watchercad7-account-delete-jkk9k\" (UID: \"06ff9c9b-7834-4543-9919-3c20a84ecfa5\") " pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.117219 4794 generic.go:334] "Generic (PLEG): container finished" podID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerID="0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f" exitCode=143 Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.117301 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.117512 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"06f73c00-c903-4f92-b59f-55d280fd48ac","Type":"ContainerDied","Data":"0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f"} Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.135409 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.250061 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vfql\" (UniqueName: \"kubernetes.io/projected/06ff9c9b-7834-4543-9919-3c20a84ecfa5-kube-api-access-7vfql\") pod \"06ff9c9b-7834-4543-9919-3c20a84ecfa5\" (UID: \"06ff9c9b-7834-4543-9919-3c20a84ecfa5\") " Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.257337 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ff9c9b-7834-4543-9919-3c20a84ecfa5-kube-api-access-7vfql" (OuterVolumeSpecName: "kube-api-access-7vfql") pod "06ff9c9b-7834-4543-9919-3c20a84ecfa5" (UID: "06ff9c9b-7834-4543-9919-3c20a84ecfa5"). InnerVolumeSpecName "kube-api-access-7vfql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.352366 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vfql\" (UniqueName: \"kubernetes.io/projected/06ff9c9b-7834-4543-9919-3c20a84ecfa5-kube-api-access-7vfql\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.645629 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": read tcp 10.217.0.2:56146->10.217.0.167:9322: read: connection reset by peer" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.645684 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": read tcp 10.217.0.2:56148->10.217.0.167:9322: read: connection reset by peer" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.747842 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="608e51f6-df42-4462-b311-16f2c33e218f" path="/var/lib/kubelet/pods/608e51f6-df42-4462-b311-16f2c33e218f/volumes" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.748642 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d268b75-5ae2-46f1-9695-218590c87682" path="/var/lib/kubelet/pods/7d268b75-5ae2-46f1-9695-218590c87682/volumes" Dec 15 14:20:50 crc kubenswrapper[4794]: I1215 14:20:50.749152 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985fe1f8-1958-4639-8518-80b5d9b321db" path="/var/lib/kubelet/pods/985fe1f8-1958-4639-8518-80b5d9b321db/volumes" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.102998 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.126949 4794 generic.go:334] "Generic (PLEG): container finished" podID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerID="28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545" exitCode=0 Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.127016 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchercad7-account-delete-jkk9k" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.127665 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.127817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"06f73c00-c903-4f92-b59f-55d280fd48ac","Type":"ContainerDied","Data":"28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545"} Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.127847 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"06f73c00-c903-4f92-b59f-55d280fd48ac","Type":"ContainerDied","Data":"3ecd57e6dd460cf6ce1f1380ada7bbd5964aa155ec1c4f8837181be5118e5645"} Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.127862 4794 scope.go:117] "RemoveContainer" containerID="28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.163798 4794 scope.go:117] "RemoveContainer" containerID="0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.181626 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchercad7-account-delete-jkk9k"] Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.186867 4794 scope.go:117] "RemoveContainer" 
containerID="28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545" Dec 15 14:20:51 crc kubenswrapper[4794]: E1215 14:20:51.187192 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545\": container with ID starting with 28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545 not found: ID does not exist" containerID="28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.187223 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545"} err="failed to get container status \"28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545\": rpc error: code = NotFound desc = could not find container \"28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545\": container with ID starting with 28ded69bd1ec5171e42749526abc0dec5921007368d36143d038bcd779730545 not found: ID does not exist" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.187278 4794 scope.go:117] "RemoveContainer" containerID="0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f" Dec 15 14:20:51 crc kubenswrapper[4794]: E1215 14:20:51.188784 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f\": container with ID starting with 0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f not found: ID does not exist" containerID="0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.188807 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f"} err="failed to get container status \"0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f\": rpc error: code = NotFound desc = could not find container \"0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f\": container with ID starting with 0f915c255a4d1a7df26e4d0d4b6b75b78a12292271adc2751e96c9356ca4092f not found: ID does not exist" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.193742 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchercad7-account-delete-jkk9k"] Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265031 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srn6h\" (UniqueName: \"kubernetes.io/projected/06f73c00-c903-4f92-b59f-55d280fd48ac-kube-api-access-srn6h\") pod \"06f73c00-c903-4f92-b59f-55d280fd48ac\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265096 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-combined-ca-bundle\") pod \"06f73c00-c903-4f92-b59f-55d280fd48ac\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265136 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-custom-prometheus-ca\") pod \"06f73c00-c903-4f92-b59f-55d280fd48ac\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265159 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-cert-memcached-mtls\") 
pod \"06f73c00-c903-4f92-b59f-55d280fd48ac\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265205 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f73c00-c903-4f92-b59f-55d280fd48ac-logs\") pod \"06f73c00-c903-4f92-b59f-55d280fd48ac\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265286 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-config-data\") pod \"06f73c00-c903-4f92-b59f-55d280fd48ac\" (UID: \"06f73c00-c903-4f92-b59f-55d280fd48ac\") " Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.265994 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f73c00-c903-4f92-b59f-55d280fd48ac-logs" (OuterVolumeSpecName: "logs") pod "06f73c00-c903-4f92-b59f-55d280fd48ac" (UID: "06f73c00-c903-4f92-b59f-55d280fd48ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.272732 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f73c00-c903-4f92-b59f-55d280fd48ac-kube-api-access-srn6h" (OuterVolumeSpecName: "kube-api-access-srn6h") pod "06f73c00-c903-4f92-b59f-55d280fd48ac" (UID: "06f73c00-c903-4f92-b59f-55d280fd48ac"). InnerVolumeSpecName "kube-api-access-srn6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.287728 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06f73c00-c903-4f92-b59f-55d280fd48ac" (UID: "06f73c00-c903-4f92-b59f-55d280fd48ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.293495 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "06f73c00-c903-4f92-b59f-55d280fd48ac" (UID: "06f73c00-c903-4f92-b59f-55d280fd48ac"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.331603 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-config-data" (OuterVolumeSpecName: "config-data") pod "06f73c00-c903-4f92-b59f-55d280fd48ac" (UID: "06f73c00-c903-4f92-b59f-55d280fd48ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.354750 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "06f73c00-c903-4f92-b59f-55d280fd48ac" (UID: "06f73c00-c903-4f92-b59f-55d280fd48ac"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.366786 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.366833 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.366846 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.366858 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06f73c00-c903-4f92-b59f-55d280fd48ac-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.366872 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f73c00-c903-4f92-b59f-55d280fd48ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.366885 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srn6h\" (UniqueName: \"kubernetes.io/projected/06f73c00-c903-4f92-b59f-55d280fd48ac-kube-api-access-srn6h\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.461744 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:20:51 crc kubenswrapper[4794]: I1215 14:20:51.469362 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.137227 4794 generic.go:334] "Generic (PLEG): container finished" podID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" containerID="9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591" exitCode=0 Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.137327 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"4bf1c253-9509-4a95-8840-c1ed039c0e9d","Type":"ContainerDied","Data":"9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591"} Dec 15 14:20:52 crc kubenswrapper[4794]: E1215 14:20:52.689154 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591 is running failed: container process not found" containerID="9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:20:52 crc kubenswrapper[4794]: E1215 14:20:52.689675 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591 is running failed: container process not found" containerID="9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:20:52 crc kubenswrapper[4794]: E1215 14:20:52.690128 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591 is running failed: container process not found" containerID="9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:20:52 crc kubenswrapper[4794]: E1215 14:20:52.690156 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591 is running failed: container process not found" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" containerName="watcher-applier" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.733357 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.748717 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" path="/var/lib/kubelet/pods/06f73c00-c903-4f92-b59f-55d280fd48ac/volumes" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.749291 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ff9c9b-7834-4543-9919-3c20a84ecfa5" path="/var/lib/kubelet/pods/06ff9c9b-7834-4543-9919-3c20a84ecfa5/volumes" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.795036 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.795278 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-central-agent" containerID="cri-o://1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f" gracePeriod=30 Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.795630 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="proxy-httpd" 
containerID="cri-o://779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905" gracePeriod=30 Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.795673 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="sg-core" containerID="cri-o://4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1" gracePeriod=30 Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.795703 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-notification-agent" containerID="cri-o://53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2" gracePeriod=30 Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.891652 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-cert-memcached-mtls\") pod \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.891740 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-config-data\") pod \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.891780 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bf1c253-9509-4a95-8840-c1ed039c0e9d-logs\") pod \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.891836 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-t9z6m\" (UniqueName: \"kubernetes.io/projected/4bf1c253-9509-4a95-8840-c1ed039c0e9d-kube-api-access-t9z6m\") pod \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.891886 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-combined-ca-bundle\") pod \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\" (UID: \"4bf1c253-9509-4a95-8840-c1ed039c0e9d\") " Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.892215 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf1c253-9509-4a95-8840-c1ed039c0e9d-logs" (OuterVolumeSpecName: "logs") pod "4bf1c253-9509-4a95-8840-c1ed039c0e9d" (UID: "4bf1c253-9509-4a95-8840-c1ed039c0e9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.892322 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bf1c253-9509-4a95-8840-c1ed039c0e9d-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.909932 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf1c253-9509-4a95-8840-c1ed039c0e9d-kube-api-access-t9z6m" (OuterVolumeSpecName: "kube-api-access-t9z6m") pod "4bf1c253-9509-4a95-8840-c1ed039c0e9d" (UID: "4bf1c253-9509-4a95-8840-c1ed039c0e9d"). InnerVolumeSpecName "kube-api-access-t9z6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.916796 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bf1c253-9509-4a95-8840-c1ed039c0e9d" (UID: "4bf1c253-9509-4a95-8840-c1ed039c0e9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.957941 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-config-data" (OuterVolumeSpecName: "config-data") pod "4bf1c253-9509-4a95-8840-c1ed039c0e9d" (UID: "4bf1c253-9509-4a95-8840-c1ed039c0e9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.959508 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "4bf1c253-9509-4a95-8840-c1ed039c0e9d" (UID: "4bf1c253-9509-4a95-8840-c1ed039c0e9d"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.993661 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.993701 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.993715 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bf1c253-9509-4a95-8840-c1ed039c0e9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:52 crc kubenswrapper[4794]: I1215 14:20:52.993726 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9z6m\" (UniqueName: \"kubernetes.io/projected/4bf1c253-9509-4a95-8840-c1ed039c0e9d-kube-api-access-t9z6m\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.147654 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"4bf1c253-9509-4a95-8840-c1ed039c0e9d","Type":"ContainerDied","Data":"3c41b80c177bfcaaf1f05ac936077232fc6f3b53167b5ea92be4e07b75282c6d"} Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.147702 4794 scope.go:117] "RemoveContainer" containerID="9ac76c1720cd4c8fe9833716d9d9a4ba9a8e82766614a8d90ad738ad300ee591" Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.147789 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.155418 4794 generic.go:334] "Generic (PLEG): container finished" podID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerID="779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905" exitCode=0 Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.155470 4794 generic.go:334] "Generic (PLEG): container finished" podID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerID="4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1" exitCode=2 Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.155502 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerDied","Data":"779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905"} Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.155539 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerDied","Data":"4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1"} Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.185756 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.191867 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:20:53 crc kubenswrapper[4794]: I1215 14:20:53.737433 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:20:53 crc kubenswrapper[4794]: E1215 14:20:53.738848 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:20:54 crc kubenswrapper[4794]: I1215 14:20:54.170397 4794 generic.go:334] "Generic (PLEG): container finished" podID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerID="1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f" exitCode=0 Dec 15 14:20:54 crc kubenswrapper[4794]: I1215 14:20:54.170426 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerDied","Data":"1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f"} Dec 15 14:20:54 crc kubenswrapper[4794]: I1215 14:20:54.747528 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" path="/var/lib/kubelet/pods/4bf1c253-9509-4a95-8840-c1ed039c0e9d/volumes" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.169863 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.212179 4794 generic.go:334] "Generic (PLEG): container finished" podID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerID="53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2" exitCode=0 Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.212228 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerDied","Data":"53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2"} Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.212267 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cffdf4db-d859-4c87-a580-1a2b595d6017","Type":"ContainerDied","Data":"3d3947c48a0f18a9bf11ffec26aa91f12c2a17a695f45a1b093dfe6415b398b6"} Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.212284 4794 scope.go:117] "RemoveContainer" containerID="779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.212445 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.240331 4794 scope.go:117] "RemoveContainer" containerID="4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.260917 4794 scope.go:117] "RemoveContainer" containerID="53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.261742 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-combined-ca-bundle\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.261856 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-config-data\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.261915 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdnxj\" (UniqueName: \"kubernetes.io/projected/cffdf4db-d859-4c87-a580-1a2b595d6017-kube-api-access-zdnxj\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.261938 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-ceilometer-tls-certs\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.261976 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-scripts\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.261993 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-log-httpd\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.262019 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-sg-core-conf-yaml\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.262052 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-run-httpd\") pod \"cffdf4db-d859-4c87-a580-1a2b595d6017\" (UID: \"cffdf4db-d859-4c87-a580-1a2b595d6017\") " Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.262614 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.263225 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.266896 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-scripts" (OuterVolumeSpecName: "scripts") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.271362 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffdf4db-d859-4c87-a580-1a2b595d6017-kube-api-access-zdnxj" (OuterVolumeSpecName: "kube-api-access-zdnxj") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "kube-api-access-zdnxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.284514 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.304800 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.323818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.351915 4794 scope.go:117] "RemoveContainer" containerID="1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.355235 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-config-data" (OuterVolumeSpecName: "config-data") pod "cffdf4db-d859-4c87-a580-1a2b595d6017" (UID: "cffdf4db-d859-4c87-a580-1a2b595d6017"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368147 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368191 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368264 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368279 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cffdf4db-d859-4c87-a580-1a2b595d6017-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368289 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368300 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368310 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdnxj\" (UniqueName: \"kubernetes.io/projected/cffdf4db-d859-4c87-a580-1a2b595d6017-kube-api-access-zdnxj\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.368321 4794 reconciler_common.go:293] 
"Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffdf4db-d859-4c87-a580-1a2b595d6017-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.373385 4794 scope.go:117] "RemoveContainer" containerID="779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.373884 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905\": container with ID starting with 779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905 not found: ID does not exist" containerID="779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.373934 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905"} err="failed to get container status \"779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905\": rpc error: code = NotFound desc = could not find container \"779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905\": container with ID starting with 779d152c9167edfa8817a25e89d755f0111c3622fc2eeb3dc1e695463688e905 not found: ID does not exist" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.373970 4794 scope.go:117] "RemoveContainer" containerID="4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.374248 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1\": container with ID starting with 4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1 not found: ID does not exist" 
containerID="4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.374293 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1"} err="failed to get container status \"4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1\": rpc error: code = NotFound desc = could not find container \"4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1\": container with ID starting with 4f8169d4c9d7d420fc5fe81bfbca93319cb491317d34743e2529bc1256c441b1 not found: ID does not exist" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.374318 4794 scope.go:117] "RemoveContainer" containerID="53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.374592 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2\": container with ID starting with 53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2 not found: ID does not exist" containerID="53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.374636 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2"} err="failed to get container status \"53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2\": rpc error: code = NotFound desc = could not find container \"53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2\": container with ID starting with 53dadbb3fecd52d0f3c73313dee6a2ee375f83a2a1d1be9f279ddafa02011ea2 not found: ID does not exist" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.374665 4794 scope.go:117] 
"RemoveContainer" containerID="1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.374948 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f\": container with ID starting with 1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f not found: ID does not exist" containerID="1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.374998 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f"} err="failed to get container status \"1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f\": rpc error: code = NotFound desc = could not find container \"1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f\": container with ID starting with 1e92223b8dc9f8d90629baaab8f5ccd7120cfc9f43d0cff012c317f1baedc24f not found: ID does not exist" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.547926 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.554458 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.568690 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569077 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-notification-agent" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569099 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-notification-agent" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569123 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="proxy-httpd" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569131 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="proxy-httpd" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569144 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="sg-core" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569150 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="sg-core" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569167 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" containerName="watcher-applier" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569173 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" containerName="watcher-applier" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569188 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-kuttl-api-log" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569194 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-kuttl-api-log" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569206 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-api" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569213 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-api" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.569232 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-central-agent" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569240 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-central-agent" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569442 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-central-agent" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569458 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-api" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569473 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f73c00-c903-4f92-b59f-55d280fd48ac" containerName="watcher-kuttl-api-log" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569487 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="sg-core" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569501 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="ceilometer-notification-agent" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569516 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" containerName="proxy-httpd" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.569526 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bf1c253-9509-4a95-8840-c1ed039c0e9d" containerName="watcher-applier" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.571322 4794 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.574324 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.574486 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.574737 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.579836 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.673925 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.673982 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.674008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-run-httpd\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc 
kubenswrapper[4794]: I1215 14:20:57.674059 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.674143 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-log-httpd\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.674189 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-scripts\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.674209 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-config-data\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.674229 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xv7\" (UniqueName: \"kubernetes.io/projected/f078acc0-81e3-454d-bc9b-9df56ffe67e2-kube-api-access-z6xv7\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.703431 4794 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.704903 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.706392 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 15 14:20:57 crc kubenswrapper[4794]: E1215 14:20:57.706530 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" containerName="watcher-decision-engine" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775524 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-log-httpd\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775628 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-scripts\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775651 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-config-data\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775674 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xv7\" (UniqueName: \"kubernetes.io/projected/f078acc0-81e3-454d-bc9b-9df56ffe67e2-kube-api-access-z6xv7\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775712 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775743 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775774 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-run-httpd\") 
pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.775838 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.776028 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-log-httpd\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.776408 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-run-httpd\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.780039 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.780435 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.780439 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.780700 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-scripts\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.782463 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-config-data\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.794945 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xv7\" (UniqueName: \"kubernetes.io/projected/f078acc0-81e3-454d-bc9b-9df56ffe67e2-kube-api-access-z6xv7\") pod \"ceilometer-0\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:57 crc kubenswrapper[4794]: I1215 14:20:57.900233 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.227174 4794 generic.go:334] "Generic (PLEG): container finished" podID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" containerID="f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733" exitCode=0 Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.227263 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c","Type":"ContainerDied","Data":"f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733"} Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.346181 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.390422 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.487699 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-cert-memcached-mtls\") pod \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.487788 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-combined-ca-bundle\") pod \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.487847 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-custom-prometheus-ca\") pod \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.487907 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn54g\" (UniqueName: \"kubernetes.io/projected/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-kube-api-access-nn54g\") pod \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.488607 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-logs\") pod \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.488677 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-config-data\") pod \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\" (UID: \"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c\") " Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.488923 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-logs" (OuterVolumeSpecName: "logs") pod "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" (UID: "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.489120 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.493397 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-kube-api-access-nn54g" (OuterVolumeSpecName: "kube-api-access-nn54g") pod "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" (UID: "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c"). InnerVolumeSpecName "kube-api-access-nn54g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.514155 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" (UID: "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.514225 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" (UID: "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.540692 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-config-data" (OuterVolumeSpecName: "config-data") pod "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" (UID: "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.560151 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" (UID: "2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.590656 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.590706 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn54g\" (UniqueName: \"kubernetes.io/projected/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-kube-api-access-nn54g\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.590722 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.590730 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-cert-memcached-mtls\") on node 
\"crc\" DevicePath \"\"" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.590738 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:20:58 crc kubenswrapper[4794]: I1215 14:20:58.749762 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffdf4db-d859-4c87-a580-1a2b595d6017" path="/var/lib/kubelet/pods/cffdf4db-d859-4c87-a580-1a2b595d6017/volumes" Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.240166 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c","Type":"ContainerDied","Data":"c51bf9b037d99d307161fbcd297db6cd4bafe6478811a72b7abc2f13de8f9d3d"} Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.240219 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.240495 4794 scope.go:117] "RemoveContainer" containerID="f4b8fe10c4821d2cf7ce18ea6b3d58da541216e1e74ef94543a67f19624b6733" Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.243022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerStarted","Data":"f89da186bc142f85608885a85c81d17b57097f98850d547dac48ab9eea169d77"} Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.243137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerStarted","Data":"0d94dd9a1561360a58a1d9a65e9048ccad6a453b45327aef9b9b6af6a5f18129"} Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.272545 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:20:59 crc kubenswrapper[4794]: I1215 14:20:59.286955 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:21:00 crc kubenswrapper[4794]: I1215 14:21:00.254511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerStarted","Data":"d951b0c6a69389f0c4030866e300afb5908dce8cb63ab4c1ac1b60ddbeed6379"} Dec 15 14:21:00 crc kubenswrapper[4794]: I1215 14:21:00.748341 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" path="/var/lib/kubelet/pods/2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c/volumes" Dec 15 14:21:01 crc kubenswrapper[4794]: I1215 14:21:01.267682 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerStarted","Data":"be8c4673395a7af0f19bc4b6aa2cc0ae56f124400136d89595abf329ffe8af33"} Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.508215 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-bffd4"] Dec 15 14:21:02 crc kubenswrapper[4794]: E1215 14:21:02.509056 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" containerName="watcher-decision-engine" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.509074 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" containerName="watcher-decision-engine" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.509301 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff1d9d7-73eb-4d94-bcfa-6c8e72f85a0c" containerName="watcher-decision-engine" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.509978 4794 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.515534 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-bffd4"] Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.650897 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4pm\" (UniqueName: \"kubernetes.io/projected/515cc0ff-9ae0-4219-81e4-2e36747c35e8-kube-api-access-9t4pm\") pod \"watcher-db-create-bffd4\" (UID: \"515cc0ff-9ae0-4219-81e4-2e36747c35e8\") " pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.752077 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4pm\" (UniqueName: \"kubernetes.io/projected/515cc0ff-9ae0-4219-81e4-2e36747c35e8-kube-api-access-9t4pm\") pod \"watcher-db-create-bffd4\" (UID: \"515cc0ff-9ae0-4219-81e4-2e36747c35e8\") " pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.770978 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4pm\" (UniqueName: \"kubernetes.io/projected/515cc0ff-9ae0-4219-81e4-2e36747c35e8-kube-api-access-9t4pm\") pod \"watcher-db-create-bffd4\" (UID: \"515cc0ff-9ae0-4219-81e4-2e36747c35e8\") " pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:02 crc kubenswrapper[4794]: I1215 14:21:02.925562 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:03 crc kubenswrapper[4794]: I1215 14:21:03.286938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerStarted","Data":"061115ffb4b209020421edbed24a2d3b3ace39c3f860d5560d4b3ea4465ae2c0"} Dec 15 14:21:03 crc kubenswrapper[4794]: I1215 14:21:03.287466 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:03 crc kubenswrapper[4794]: I1215 14:21:03.316430 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.204200662 podStartE2EDuration="6.316408964s" podCreationTimestamp="2025-12-15 14:20:57 +0000 UTC" firstStartedPulling="2025-12-15 14:20:58.351925722 +0000 UTC m=+1620.203948160" lastFinishedPulling="2025-12-15 14:21:02.464134024 +0000 UTC m=+1624.316156462" observedRunningTime="2025-12-15 14:21:03.313651076 +0000 UTC m=+1625.165673544" watchObservedRunningTime="2025-12-15 14:21:03.316408964 +0000 UTC m=+1625.168431402" Dec 15 14:21:03 crc kubenswrapper[4794]: W1215 14:21:03.387678 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515cc0ff_9ae0_4219_81e4_2e36747c35e8.slice/crio-78b9e3b461060d267b33271e013a04245ec132d909b48b68b0a885e825868b02 WatchSource:0}: Error finding container 78b9e3b461060d267b33271e013a04245ec132d909b48b68b0a885e825868b02: Status 404 returned error can't find the container with id 78b9e3b461060d267b33271e013a04245ec132d909b48b68b0a885e825868b02 Dec 15 14:21:03 crc kubenswrapper[4794]: I1215 14:21:03.396377 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-bffd4"] Dec 15 14:21:04 crc kubenswrapper[4794]: I1215 14:21:04.295396 4794 generic.go:334] "Generic (PLEG): 
container finished" podID="515cc0ff-9ae0-4219-81e4-2e36747c35e8" containerID="522f6336f89b5be0536f8116bd501ff2b25b37104204f5d8a1aaac54241e7f4f" exitCode=0 Dec 15 14:21:04 crc kubenswrapper[4794]: I1215 14:21:04.295525 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-bffd4" event={"ID":"515cc0ff-9ae0-4219-81e4-2e36747c35e8","Type":"ContainerDied","Data":"522f6336f89b5be0536f8116bd501ff2b25b37104204f5d8a1aaac54241e7f4f"} Dec 15 14:21:04 crc kubenswrapper[4794]: I1215 14:21:04.295782 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-bffd4" event={"ID":"515cc0ff-9ae0-4219-81e4-2e36747c35e8","Type":"ContainerStarted","Data":"78b9e3b461060d267b33271e013a04245ec132d909b48b68b0a885e825868b02"} Dec 15 14:21:05 crc kubenswrapper[4794]: I1215 14:21:05.663681 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:05 crc kubenswrapper[4794]: I1215 14:21:05.737112 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:21:05 crc kubenswrapper[4794]: E1215 14:21:05.737713 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:21:05 crc kubenswrapper[4794]: I1215 14:21:05.809526 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4pm\" (UniqueName: \"kubernetes.io/projected/515cc0ff-9ae0-4219-81e4-2e36747c35e8-kube-api-access-9t4pm\") pod \"515cc0ff-9ae0-4219-81e4-2e36747c35e8\" (UID: 
\"515cc0ff-9ae0-4219-81e4-2e36747c35e8\") " Dec 15 14:21:05 crc kubenswrapper[4794]: I1215 14:21:05.814841 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515cc0ff-9ae0-4219-81e4-2e36747c35e8-kube-api-access-9t4pm" (OuterVolumeSpecName: "kube-api-access-9t4pm") pod "515cc0ff-9ae0-4219-81e4-2e36747c35e8" (UID: "515cc0ff-9ae0-4219-81e4-2e36747c35e8"). InnerVolumeSpecName "kube-api-access-9t4pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:21:05 crc kubenswrapper[4794]: I1215 14:21:05.911675 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4pm\" (UniqueName: \"kubernetes.io/projected/515cc0ff-9ae0-4219-81e4-2e36747c35e8-kube-api-access-9t4pm\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:06 crc kubenswrapper[4794]: I1215 14:21:06.311999 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-bffd4" event={"ID":"515cc0ff-9ae0-4219-81e4-2e36747c35e8","Type":"ContainerDied","Data":"78b9e3b461060d267b33271e013a04245ec132d909b48b68b0a885e825868b02"} Dec 15 14:21:06 crc kubenswrapper[4794]: I1215 14:21:06.312033 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78b9e3b461060d267b33271e013a04245ec132d909b48b68b0a885e825868b02" Dec 15 14:21:06 crc kubenswrapper[4794]: I1215 14:21:06.312058 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-bffd4" Dec 15 14:21:06 crc kubenswrapper[4794]: I1215 14:21:06.678744 4794 scope.go:117] "RemoveContainer" containerID="b1804910c08a77f0396eb77684680cd80717d3355d0650f3278a159f48afdaa9" Dec 15 14:21:06 crc kubenswrapper[4794]: I1215 14:21:06.714821 4794 scope.go:117] "RemoveContainer" containerID="8bc2791eac3f32df055f689898e397c05cb4f0af73069c0da511781cd3f9d0f7" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.511385 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-2cb3-account-create-mzgzx"] Dec 15 14:21:12 crc kubenswrapper[4794]: E1215 14:21:12.512226 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515cc0ff-9ae0-4219-81e4-2e36747c35e8" containerName="mariadb-database-create" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.512238 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="515cc0ff-9ae0-4219-81e4-2e36747c35e8" containerName="mariadb-database-create" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.512423 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="515cc0ff-9ae0-4219-81e4-2e36747c35e8" containerName="mariadb-database-create" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.513201 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.516513 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.520930 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2cb3-account-create-mzgzx"] Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.630553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pxnt\" (UniqueName: \"kubernetes.io/projected/d5601b23-626c-4218-9a46-bf326eeeab6b-kube-api-access-5pxnt\") pod \"watcher-2cb3-account-create-mzgzx\" (UID: \"d5601b23-626c-4218-9a46-bf326eeeab6b\") " pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.732571 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pxnt\" (UniqueName: \"kubernetes.io/projected/d5601b23-626c-4218-9a46-bf326eeeab6b-kube-api-access-5pxnt\") pod \"watcher-2cb3-account-create-mzgzx\" (UID: \"d5601b23-626c-4218-9a46-bf326eeeab6b\") " pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.772179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pxnt\" (UniqueName: \"kubernetes.io/projected/d5601b23-626c-4218-9a46-bf326eeeab6b-kube-api-access-5pxnt\") pod \"watcher-2cb3-account-create-mzgzx\" (UID: \"d5601b23-626c-4218-9a46-bf326eeeab6b\") " pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:12 crc kubenswrapper[4794]: I1215 14:21:12.834143 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:13 crc kubenswrapper[4794]: I1215 14:21:13.280552 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-2cb3-account-create-mzgzx"] Dec 15 14:21:13 crc kubenswrapper[4794]: W1215 14:21:13.301879 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5601b23_626c_4218_9a46_bf326eeeab6b.slice/crio-e7cb41f7f1fd1d88af64f2137faa479a838104b616bd1d56016c9e4efea31b5e WatchSource:0}: Error finding container e7cb41f7f1fd1d88af64f2137faa479a838104b616bd1d56016c9e4efea31b5e: Status 404 returned error can't find the container with id e7cb41f7f1fd1d88af64f2137faa479a838104b616bd1d56016c9e4efea31b5e Dec 15 14:21:13 crc kubenswrapper[4794]: I1215 14:21:13.373933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" event={"ID":"d5601b23-626c-4218-9a46-bf326eeeab6b","Type":"ContainerStarted","Data":"e7cb41f7f1fd1d88af64f2137faa479a838104b616bd1d56016c9e4efea31b5e"} Dec 15 14:21:14 crc kubenswrapper[4794]: I1215 14:21:14.383293 4794 generic.go:334] "Generic (PLEG): container finished" podID="d5601b23-626c-4218-9a46-bf326eeeab6b" containerID="42a7538d19bb3a907b923d9c0f44f31ca268bcbc2a10a3ba0698e991113b53c5" exitCode=0 Dec 15 14:21:14 crc kubenswrapper[4794]: I1215 14:21:14.383349 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" event={"ID":"d5601b23-626c-4218-9a46-bf326eeeab6b","Type":"ContainerDied","Data":"42a7538d19bb3a907b923d9c0f44f31ca268bcbc2a10a3ba0698e991113b53c5"} Dec 15 14:21:15 crc kubenswrapper[4794]: I1215 14:21:15.743357 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:15 crc kubenswrapper[4794]: I1215 14:21:15.884328 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pxnt\" (UniqueName: \"kubernetes.io/projected/d5601b23-626c-4218-9a46-bf326eeeab6b-kube-api-access-5pxnt\") pod \"d5601b23-626c-4218-9a46-bf326eeeab6b\" (UID: \"d5601b23-626c-4218-9a46-bf326eeeab6b\") " Dec 15 14:21:15 crc kubenswrapper[4794]: I1215 14:21:15.888990 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5601b23-626c-4218-9a46-bf326eeeab6b-kube-api-access-5pxnt" (OuterVolumeSpecName: "kube-api-access-5pxnt") pod "d5601b23-626c-4218-9a46-bf326eeeab6b" (UID: "d5601b23-626c-4218-9a46-bf326eeeab6b"). InnerVolumeSpecName "kube-api-access-5pxnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:21:15 crc kubenswrapper[4794]: I1215 14:21:15.994475 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pxnt\" (UniqueName: \"kubernetes.io/projected/d5601b23-626c-4218-9a46-bf326eeeab6b-kube-api-access-5pxnt\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:16 crc kubenswrapper[4794]: I1215 14:21:16.398573 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" event={"ID":"d5601b23-626c-4218-9a46-bf326eeeab6b","Type":"ContainerDied","Data":"e7cb41f7f1fd1d88af64f2137faa479a838104b616bd1d56016c9e4efea31b5e"} Dec 15 14:21:16 crc kubenswrapper[4794]: I1215 14:21:16.398963 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7cb41f7f1fd1d88af64f2137faa479a838104b616bd1d56016c9e4efea31b5e" Dec 15 14:21:16 crc kubenswrapper[4794]: I1215 14:21:16.398648 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-2cb3-account-create-mzgzx" Dec 15 14:21:16 crc kubenswrapper[4794]: I1215 14:21:16.737931 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:21:16 crc kubenswrapper[4794]: E1215 14:21:16.738453 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.836806 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zblqx"] Dec 15 14:21:17 crc kubenswrapper[4794]: E1215 14:21:17.837205 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5601b23-626c-4218-9a46-bf326eeeab6b" containerName="mariadb-account-create" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.837219 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5601b23-626c-4218-9a46-bf326eeeab6b" containerName="mariadb-account-create" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.837437 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5601b23-626c-4218-9a46-bf326eeeab6b" containerName="mariadb-account-create" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.838187 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.841090 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.841310 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-46kqz" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.846342 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zblqx"] Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.928787 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.929214 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.929279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dgpg\" (UniqueName: \"kubernetes.io/projected/c3075f03-ed4f-4c07-9045-cc157ff544f3-kube-api-access-7dgpg\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:17 crc kubenswrapper[4794]: I1215 14:21:17.929463 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-config-data\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.030769 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-config-data\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.030840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.030897 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.030940 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dgpg\" (UniqueName: \"kubernetes.io/projected/c3075f03-ed4f-4c07-9045-cc157ff544f3-kube-api-access-7dgpg\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: 
I1215 14:21:18.040266 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-config-data\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.040439 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-db-sync-config-data\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.045606 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.050224 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dgpg\" (UniqueName: \"kubernetes.io/projected/c3075f03-ed4f-4c07-9045-cc157ff544f3-kube-api-access-7dgpg\") pod \"watcher-kuttl-db-sync-zblqx\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.218448 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:18 crc kubenswrapper[4794]: I1215 14:21:18.723134 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zblqx"] Dec 15 14:21:19 crc kubenswrapper[4794]: I1215 14:21:19.421450 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" event={"ID":"c3075f03-ed4f-4c07-9045-cc157ff544f3","Type":"ContainerStarted","Data":"cf96fafd5d3cba47b78d81de48266458c9434a141df497ba5d4d793783b51ff6"} Dec 15 14:21:19 crc kubenswrapper[4794]: I1215 14:21:19.421878 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" event={"ID":"c3075f03-ed4f-4c07-9045-cc157ff544f3","Type":"ContainerStarted","Data":"fb3cc8b9a0f419ae1943f5bd32f99274f65679f227ab37edb476d7954b1736d6"} Dec 15 14:21:19 crc kubenswrapper[4794]: I1215 14:21:19.439184 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" podStartSLOduration=2.439162736 podStartE2EDuration="2.439162736s" podCreationTimestamp="2025-12-15 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:21:19.435340358 +0000 UTC m=+1641.287362816" watchObservedRunningTime="2025-12-15 14:21:19.439162736 +0000 UTC m=+1641.291185184" Dec 15 14:21:21 crc kubenswrapper[4794]: I1215 14:21:21.443199 4794 generic.go:334] "Generic (PLEG): container finished" podID="c3075f03-ed4f-4c07-9045-cc157ff544f3" containerID="cf96fafd5d3cba47b78d81de48266458c9434a141df497ba5d4d793783b51ff6" exitCode=0 Dec 15 14:21:21 crc kubenswrapper[4794]: I1215 14:21:21.443320 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" 
event={"ID":"c3075f03-ed4f-4c07-9045-cc157ff544f3","Type":"ContainerDied","Data":"cf96fafd5d3cba47b78d81de48266458c9434a141df497ba5d4d793783b51ff6"} Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.786001 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.929987 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-config-data\") pod \"c3075f03-ed4f-4c07-9045-cc157ff544f3\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.930031 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-combined-ca-bundle\") pod \"c3075f03-ed4f-4c07-9045-cc157ff544f3\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.930075 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-db-sync-config-data\") pod \"c3075f03-ed4f-4c07-9045-cc157ff544f3\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.930171 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dgpg\" (UniqueName: \"kubernetes.io/projected/c3075f03-ed4f-4c07-9045-cc157ff544f3-kube-api-access-7dgpg\") pod \"c3075f03-ed4f-4c07-9045-cc157ff544f3\" (UID: \"c3075f03-ed4f-4c07-9045-cc157ff544f3\") " Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.937018 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c3075f03-ed4f-4c07-9045-cc157ff544f3-kube-api-access-7dgpg" (OuterVolumeSpecName: "kube-api-access-7dgpg") pod "c3075f03-ed4f-4c07-9045-cc157ff544f3" (UID: "c3075f03-ed4f-4c07-9045-cc157ff544f3"). InnerVolumeSpecName "kube-api-access-7dgpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.952462 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c3075f03-ed4f-4c07-9045-cc157ff544f3" (UID: "c3075f03-ed4f-4c07-9045-cc157ff544f3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.955078 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3075f03-ed4f-4c07-9045-cc157ff544f3" (UID: "c3075f03-ed4f-4c07-9045-cc157ff544f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:22 crc kubenswrapper[4794]: I1215 14:21:22.973705 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-config-data" (OuterVolumeSpecName: "config-data") pod "c3075f03-ed4f-4c07-9045-cc157ff544f3" (UID: "c3075f03-ed4f-4c07-9045-cc157ff544f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.031950 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dgpg\" (UniqueName: \"kubernetes.io/projected/c3075f03-ed4f-4c07-9045-cc157ff544f3-kube-api-access-7dgpg\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.032004 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.032022 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.032036 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3075f03-ed4f-4c07-9045-cc157ff544f3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.459801 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" event={"ID":"c3075f03-ed4f-4c07-9045-cc157ff544f3","Type":"ContainerDied","Data":"fb3cc8b9a0f419ae1943f5bd32f99274f65679f227ab37edb476d7954b1736d6"} Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.459838 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb3cc8b9a0f419ae1943f5bd32f99274f65679f227ab37edb476d7954b1736d6" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.459881 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-zblqx" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.733223 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:21:23 crc kubenswrapper[4794]: E1215 14:21:23.734015 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3075f03-ed4f-4c07-9045-cc157ff544f3" containerName="watcher-kuttl-db-sync" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.734045 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3075f03-ed4f-4c07-9045-cc157ff544f3" containerName="watcher-kuttl-db-sync" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.734274 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3075f03-ed4f-4c07-9045-cc157ff544f3" containerName="watcher-kuttl-db-sync" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.735802 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.738497 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-46kqz" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.738510 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.755058 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.756994 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.759506 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.762297 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.775619 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.811976 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.849519 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.849718 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.849820 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbhqb\" (UniqueName: \"kubernetes.io/projected/32c71f2d-c68f-4021-8a30-2ea58da786c5-kube-api-access-lbhqb\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.850003 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c71f2d-c68f-4021-8a30-2ea58da786c5-logs\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.850081 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.850196 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.884888 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.893141 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.900119 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.901476 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.906033 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.917985 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951150 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-logs\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951200 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951231 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c71f2d-c68f-4021-8a30-2ea58da786c5-logs\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951263 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 
14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951295 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951322 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951347 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951364 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951382 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc 
kubenswrapper[4794]: I1215 14:21:23.951398 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b646ec-0aff-43e4-97b2-991ca571ff83-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951419 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951436 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx7dl\" (UniqueName: \"kubernetes.io/projected/f3b646ec-0aff-43e4-97b2-991ca571ff83-kube-api-access-mx7dl\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951454 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951472 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951497 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbhqb\" (UniqueName: \"kubernetes.io/projected/32c71f2d-c68f-4021-8a30-2ea58da786c5-kube-api-access-lbhqb\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951528 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.951560 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvlr5\" (UniqueName: \"kubernetes.io/projected/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-kube-api-access-kvlr5\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.954027 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c71f2d-c68f-4021-8a30-2ea58da786c5-logs\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.956659 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 
14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.956687 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.956839 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.957146 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:23 crc kubenswrapper[4794]: I1215 14:21:23.970933 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbhqb\" (UniqueName: \"kubernetes.io/projected/32c71f2d-c68f-4021-8a30-2ea58da786c5-kube-api-access-lbhqb\") pod \"watcher-kuttl-api-0\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053258 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053620 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b646ec-0aff-43e4-97b2-991ca571ff83-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053649 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx7dl\" (UniqueName: \"kubernetes.io/projected/f3b646ec-0aff-43e4-97b2-991ca571ff83-kube-api-access-mx7dl\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053670 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053690 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053726 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053764 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.053783 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054010 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054057 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvlr5\" (UniqueName: \"kubernetes.io/projected/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-kube-api-access-kvlr5\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054082 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 
14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054172 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzhnn\" (UniqueName: \"kubernetes.io/projected/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-kube-api-access-rzhnn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054215 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-logs\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054250 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054292 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054337 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 
14:21:24.054359 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054355 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b646ec-0aff-43e4-97b2-991ca571ff83-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.054783 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-logs\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.067333 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.067357 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.067333 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.067702 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.067850 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.067869 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.068450 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.071878 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx7dl\" (UniqueName: 
\"kubernetes.io/projected/f3b646ec-0aff-43e4-97b2-991ca571ff83-kube-api-access-mx7dl\") pod \"watcher-kuttl-applier-0\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.076893 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvlr5\" (UniqueName: \"kubernetes.io/projected/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-kube-api-access-kvlr5\") pod \"watcher-kuttl-api-1\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.116425 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.155357 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.155437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.155493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.155528 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.155971 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.156009 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzhnn\" (UniqueName: \"kubernetes.io/projected/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-kube-api-access-rzhnn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.155904 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.159148 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.160658 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.160947 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.161612 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.177242 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.181989 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.186921 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzhnn\" (UniqueName: \"kubernetes.io/projected/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-kube-api-access-rzhnn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.233460 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.605955 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.688112 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.773634 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:21:24 crc kubenswrapper[4794]: W1215 14:21:24.774392 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdbadb79_adaf_4e3b_a3a4_ed5b92a18dc0.slice/crio-345caece2e6ed05e9126c7d859ef39a6c388fd5a50f854fbbf95abfda9cdcb76 WatchSource:0}: Error finding container 345caece2e6ed05e9126c7d859ef39a6c388fd5a50f854fbbf95abfda9cdcb76: Status 404 returned error can't find the container with id 345caece2e6ed05e9126c7d859ef39a6c388fd5a50f854fbbf95abfda9cdcb76 Dec 15 14:21:24 crc kubenswrapper[4794]: W1215 14:21:24.777778 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d0b86f5_c4a7_4f93_9cc9_1e544cf60e6a.slice/crio-ad31122ee09d9147b422985eb33efa877aa9fabefb194845df503f847c33904d WatchSource:0}: Error finding container ad31122ee09d9147b422985eb33efa877aa9fabefb194845df503f847c33904d: Status 404 returned error can't find the container with id ad31122ee09d9147b422985eb33efa877aa9fabefb194845df503f847c33904d Dec 15 14:21:24 crc kubenswrapper[4794]: I1215 14:21:24.779426 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:21:25 crc kubenswrapper[4794]: I1215 14:21:25.491901 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"32c71f2d-c68f-4021-8a30-2ea58da786c5","Type":"ContainerStarted","Data":"772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664"} Dec 15 14:21:25 crc kubenswrapper[4794]: I1215 14:21:25.492285 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"32c71f2d-c68f-4021-8a30-2ea58da786c5","Type":"ContainerStarted","Data":"7ba554d7e431d124db174289cf0411a2f96505a725d8d124866338a9beeff6f1"} Dec 15 14:21:25 crc kubenswrapper[4794]: I1215 14:21:25.497022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0","Type":"ContainerStarted","Data":"345caece2e6ed05e9126c7d859ef39a6c388fd5a50f854fbbf95abfda9cdcb76"} Dec 15 14:21:25 crc kubenswrapper[4794]: I1215 14:21:25.503661 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f3b646ec-0aff-43e4-97b2-991ca571ff83","Type":"ContainerStarted","Data":"95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3"} Dec 15 14:21:25 crc kubenswrapper[4794]: I1215 14:21:25.503707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f3b646ec-0aff-43e4-97b2-991ca571ff83","Type":"ContainerStarted","Data":"ecd213ac3f79e72278ded14723ff0c2da865e158aaa232c2b253f8e262a8e5d0"} Dec 15 14:21:25 crc kubenswrapper[4794]: I1215 14:21:25.505446 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a","Type":"ContainerStarted","Data":"ad31122ee09d9147b422985eb33efa877aa9fabefb194845df503f847c33904d"} Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.517559 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"32c71f2d-c68f-4021-8a30-2ea58da786c5","Type":"ContainerStarted","Data":"18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6"} Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.518488 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.519931 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0","Type":"ContainerStarted","Data":"e6e1264948080ec083ed47e7d3dbae0ec674264e29171647a325e8b51d8bb345"} Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.520073 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0","Type":"ContainerStarted","Data":"6c2891f7cdbbb90d825577bdc72819914f0867412d9569d5960257ea69a4870c"} Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.520423 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.521601 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a","Type":"ContainerStarted","Data":"46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881"} Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.540266 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.540247689 podStartE2EDuration="3.540247689s" podCreationTimestamp="2025-12-15 14:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:21:26.538769207 +0000 UTC m=+1648.390791645" watchObservedRunningTime="2025-12-15 14:21:26.540247689 +0000 UTC m=+1648.392270127" Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.564048 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=3.564027492 podStartE2EDuration="3.564027492s" podCreationTimestamp="2025-12-15 14:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:21:26.557479337 +0000 UTC m=+1648.409501795" watchObservedRunningTime="2025-12-15 14:21:26.564027492 +0000 UTC m=+1648.416049940" Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.582507 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=3.582485634 podStartE2EDuration="3.582485634s" podCreationTimestamp="2025-12-15 14:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:21:26.574716575 +0000 UTC m=+1648.426739023" watchObservedRunningTime="2025-12-15 14:21:26.582485634 +0000 UTC m=+1648.434508092" Dec 15 14:21:26 crc kubenswrapper[4794]: I1215 14:21:26.590962 
4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=3.590943804 podStartE2EDuration="3.590943804s" podCreationTimestamp="2025-12-15 14:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:21:26.588911666 +0000 UTC m=+1648.440934124" watchObservedRunningTime="2025-12-15 14:21:26.590943804 +0000 UTC m=+1648.442966262" Dec 15 14:21:27 crc kubenswrapper[4794]: I1215 14:21:27.907572 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:28 crc kubenswrapper[4794]: I1215 14:21:28.537516 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 14:21:28 crc kubenswrapper[4794]: I1215 14:21:28.665743 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:28 crc kubenswrapper[4794]: I1215 14:21:28.692874 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:28 crc kubenswrapper[4794]: I1215 14:21:28.743246 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:21:28 crc kubenswrapper[4794]: E1215 14:21:28.743524 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:21:29 crc kubenswrapper[4794]: I1215 14:21:29.116753 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:29 crc kubenswrapper[4794]: I1215 14:21:29.159653 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:29 crc kubenswrapper[4794]: I1215 14:21:29.182830 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.117495 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.123441 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.160314 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.183115 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.195726 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.204135 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.234392 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.257149 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:34 crc 
kubenswrapper[4794]: I1215 14:21:34.579934 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.585396 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.589003 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.601878 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:21:34 crc kubenswrapper[4794]: I1215 14:21:34.608551 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:21:36 crc kubenswrapper[4794]: I1215 14:21:36.565099 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:21:36 crc kubenswrapper[4794]: I1215 14:21:36.565731 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-central-agent" containerID="cri-o://f89da186bc142f85608885a85c81d17b57097f98850d547dac48ab9eea169d77" gracePeriod=30 Dec 15 14:21:36 crc kubenswrapper[4794]: I1215 14:21:36.565774 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="sg-core" containerID="cri-o://be8c4673395a7af0f19bc4b6aa2cc0ae56f124400136d89595abf329ffe8af33" gracePeriod=30 Dec 15 14:21:36 crc kubenswrapper[4794]: I1215 14:21:36.565799 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" 
podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="proxy-httpd" containerID="cri-o://061115ffb4b209020421edbed24a2d3b3ace39c3f860d5560d4b3ea4465ae2c0" gracePeriod=30 Dec 15 14:21:36 crc kubenswrapper[4794]: I1215 14:21:36.565839 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-notification-agent" containerID="cri-o://d951b0c6a69389f0c4030866e300afb5908dce8cb63ab4c1ac1b60ddbeed6379" gracePeriod=30 Dec 15 14:21:37 crc kubenswrapper[4794]: I1215 14:21:37.611043 4794 generic.go:334] "Generic (PLEG): container finished" podID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerID="061115ffb4b209020421edbed24a2d3b3ace39c3f860d5560d4b3ea4465ae2c0" exitCode=0 Dec 15 14:21:37 crc kubenswrapper[4794]: I1215 14:21:37.611337 4794 generic.go:334] "Generic (PLEG): container finished" podID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerID="be8c4673395a7af0f19bc4b6aa2cc0ae56f124400136d89595abf329ffe8af33" exitCode=2 Dec 15 14:21:37 crc kubenswrapper[4794]: I1215 14:21:37.611349 4794 generic.go:334] "Generic (PLEG): container finished" podID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerID="f89da186bc142f85608885a85c81d17b57097f98850d547dac48ab9eea169d77" exitCode=0 Dec 15 14:21:37 crc kubenswrapper[4794]: I1215 14:21:37.611122 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerDied","Data":"061115ffb4b209020421edbed24a2d3b3ace39c3f860d5560d4b3ea4465ae2c0"} Dec 15 14:21:37 crc kubenswrapper[4794]: I1215 14:21:37.611381 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerDied","Data":"be8c4673395a7af0f19bc4b6aa2cc0ae56f124400136d89595abf329ffe8af33"} Dec 15 14:21:37 crc kubenswrapper[4794]: I1215 14:21:37.611396 
4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerDied","Data":"f89da186bc142f85608885a85c81d17b57097f98850d547dac48ab9eea169d77"} Dec 15 14:21:41 crc kubenswrapper[4794]: I1215 14:21:41.736915 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:21:41 crc kubenswrapper[4794]: E1215 14:21:41.737535 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:21:43 crc kubenswrapper[4794]: I1215 14:21:43.668789 4794 generic.go:334] "Generic (PLEG): container finished" podID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerID="d951b0c6a69389f0c4030866e300afb5908dce8cb63ab4c1ac1b60ddbeed6379" exitCode=0 Dec 15 14:21:43 crc kubenswrapper[4794]: I1215 14:21:43.668861 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerDied","Data":"d951b0c6a69389f0c4030866e300afb5908dce8cb63ab4c1ac1b60ddbeed6379"} Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.384543 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431097 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-combined-ca-bundle\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431170 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-ceilometer-tls-certs\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431239 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6xv7\" (UniqueName: \"kubernetes.io/projected/f078acc0-81e3-454d-bc9b-9df56ffe67e2-kube-api-access-z6xv7\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-run-httpd\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431345 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-scripts\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431413 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-sg-core-conf-yaml\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431443 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-config-data\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431483 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-log-httpd\") pod \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\" (UID: \"f078acc0-81e3-454d-bc9b-9df56ffe67e2\") " Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.431927 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.432266 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.450068 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-scripts" (OuterVolumeSpecName: "scripts") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.450904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f078acc0-81e3-454d-bc9b-9df56ffe67e2-kube-api-access-z6xv7" (OuterVolumeSpecName: "kube-api-access-z6xv7") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "kube-api-access-z6xv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.493721 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.517151 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.533227 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6xv7\" (UniqueName: \"kubernetes.io/projected/f078acc0-81e3-454d-bc9b-9df56ffe67e2-kube-api-access-z6xv7\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.533265 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.533279 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.533291 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.533304 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f078acc0-81e3-454d-bc9b-9df56ffe67e2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.533316 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.543895 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-config-data" (OuterVolumeSpecName: "config-data") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.551029 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f078acc0-81e3-454d-bc9b-9df56ffe67e2" (UID: "f078acc0-81e3-454d-bc9b-9df56ffe67e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.635487 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.635531 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f078acc0-81e3-454d-bc9b-9df56ffe67e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.680045 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"f078acc0-81e3-454d-bc9b-9df56ffe67e2","Type":"ContainerDied","Data":"0d94dd9a1561360a58a1d9a65e9048ccad6a453b45327aef9b9b6af6a5f18129"} Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.680102 4794 scope.go:117] "RemoveContainer" containerID="061115ffb4b209020421edbed24a2d3b3ace39c3f860d5560d4b3ea4465ae2c0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.680164 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.713106 4794 scope.go:117] "RemoveContainer" containerID="be8c4673395a7af0f19bc4b6aa2cc0ae56f124400136d89595abf329ffe8af33" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.716721 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.723167 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.737648 4794 scope.go:117] "RemoveContainer" containerID="d951b0c6a69389f0c4030866e300afb5908dce8cb63ab4c1ac1b60ddbeed6379" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.752830 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" path="/var/lib/kubelet/pods/f078acc0-81e3-454d-bc9b-9df56ffe67e2/volumes" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.753996 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:21:44 crc kubenswrapper[4794]: E1215 14:21:44.754321 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="sg-core" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754341 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="sg-core" Dec 15 14:21:44 crc kubenswrapper[4794]: E1215 14:21:44.754355 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="proxy-httpd" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754374 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="proxy-httpd" Dec 15 14:21:44 crc kubenswrapper[4794]: E1215 14:21:44.754401 4794 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-notification-agent" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754409 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-notification-agent" Dec 15 14:21:44 crc kubenswrapper[4794]: E1215 14:21:44.754427 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-central-agent" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754435 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-central-agent" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754664 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-central-agent" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754690 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="sg-core" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754701 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="proxy-httpd" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.754714 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f078acc0-81e3-454d-bc9b-9df56ffe67e2" containerName="ceilometer-notification-agent" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.757131 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.759919 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.760025 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.760173 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.763253 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.770726 4794 scope.go:117] "RemoveContainer" containerID="f89da186bc142f85608885a85c81d17b57097f98850d547dac48ab9eea169d77" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.837899 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.837955 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-run-httpd\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.838036 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmps\" (UniqueName: 
\"kubernetes.io/projected/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-kube-api-access-fgmps\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.838068 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.838111 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-config-data\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.838316 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-scripts\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.838389 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.838560 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-log-httpd\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940018 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-log-httpd\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940109 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940145 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-run-httpd\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940188 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmps\" (UniqueName: \"kubernetes.io/projected/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-kube-api-access-fgmps\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940214 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940255 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-config-data\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940303 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-scripts\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940335 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940477 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-run-httpd\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.940503 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-log-httpd\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.945024 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.945085 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-config-data\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.945555 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.946729 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-scripts\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.949183 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:44 crc kubenswrapper[4794]: I1215 14:21:44.958836 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmps\" (UniqueName: \"kubernetes.io/projected/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-kube-api-access-fgmps\") pod \"ceilometer-0\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:45 crc kubenswrapper[4794]: I1215 14:21:45.085125 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:45 crc kubenswrapper[4794]: I1215 14:21:45.510064 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:21:45 crc kubenswrapper[4794]: I1215 14:21:45.694188 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerStarted","Data":"a3fd2152eedd4456113f6b8285ae3f1ed7ae3d0609807a8e7530134a4ead9b05"} Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.249262 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.705558 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerStarted","Data":"99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56"} Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.823603 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.825215 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.838839 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.871311 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.871360 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.871520 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909724e6-74b1-4a24-a66d-79ab7b2d7801-logs\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.871599 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7zk\" (UniqueName: \"kubernetes.io/projected/909724e6-74b1-4a24-a66d-79ab7b2d7801-kube-api-access-pn7zk\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.871629 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.871663 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.973229 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.973312 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.973401 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909724e6-74b1-4a24-a66d-79ab7b2d7801-logs\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.973467 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7zk\" (UniqueName: 
\"kubernetes.io/projected/909724e6-74b1-4a24-a66d-79ab7b2d7801-kube-api-access-pn7zk\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.973492 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.973538 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.975153 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909724e6-74b1-4a24-a66d-79ab7b2d7801-logs\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.979677 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.981023 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-config-data\") pod \"watcher-kuttl-api-2\" (UID: 
\"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.991341 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.991428 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:46 crc kubenswrapper[4794]: I1215 14:21:46.994680 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7zk\" (UniqueName: \"kubernetes.io/projected/909724e6-74b1-4a24-a66d-79ab7b2d7801-kube-api-access-pn7zk\") pod \"watcher-kuttl-api-2\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:47 crc kubenswrapper[4794]: I1215 14:21:47.145168 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.716754 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerStarted","Data":"397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99"} Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.869088 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftr6k"] Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.871243 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.909149 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftr6k"] Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.993746 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8cz\" (UniqueName: \"kubernetes.io/projected/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-kube-api-access-fs8cz\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.993832 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-utilities\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:47.993918 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-catalog-content\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.095200 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-utilities\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.095682 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-catalog-content\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.095785 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8cz\" (UniqueName: \"kubernetes.io/projected/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-kube-api-access-fs8cz\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.095815 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-utilities\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.096141 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-catalog-content\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.129553 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8cz\" (UniqueName: \"kubernetes.io/projected/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-kube-api-access-fs8cz\") pod \"certified-operators-ftr6k\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.201472 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.349626 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.699330 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftr6k"] Dec 15 14:21:48 crc kubenswrapper[4794]: W1215 14:21:48.699448 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c30a2c3_4d23_4189_90c5_14bb73c79c5d.slice/crio-2c093c9492de1fc0f5363f6bb646fdd226247596780afb81233478c313d9b0d4 WatchSource:0}: Error finding container 2c093c9492de1fc0f5363f6bb646fdd226247596780afb81233478c313d9b0d4: Status 404 returned error can't find the container with id 2c093c9492de1fc0f5363f6bb646fdd226247596780afb81233478c313d9b0d4 Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.761323 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" 
event={"ID":"909724e6-74b1-4a24-a66d-79ab7b2d7801","Type":"ContainerStarted","Data":"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29"} Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.761385 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"909724e6-74b1-4a24-a66d-79ab7b2d7801","Type":"ContainerStarted","Data":"30f51270c8c46ca00f28e17da9650b65be0411962a4cf49da27bc2c2f95c6aca"} Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.761402 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerStarted","Data":"90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a"} Dec 15 14:21:48 crc kubenswrapper[4794]: I1215 14:21:48.761414 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerStarted","Data":"2c093c9492de1fc0f5363f6bb646fdd226247596780afb81233478c313d9b0d4"} Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.749888 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"909724e6-74b1-4a24-a66d-79ab7b2d7801","Type":"ContainerStarted","Data":"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f"} Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.750183 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.753070 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerStarted","Data":"d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d"} Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.753239 4794 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.757216 4794 generic.go:334] "Generic (PLEG): container finished" podID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerID="17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a" exitCode=0 Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.757271 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerDied","Data":"17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a"} Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.774075 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-2" podStartSLOduration=3.774058133 podStartE2EDuration="3.774058133s" podCreationTimestamp="2025-12-15 14:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:21:49.770987017 +0000 UTC m=+1671.623009475" watchObservedRunningTime="2025-12-15 14:21:49.774058133 +0000 UTC m=+1671.626080571" Dec 15 14:21:49 crc kubenswrapper[4794]: I1215 14:21:49.826146 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.436641383 podStartE2EDuration="5.826129057s" podCreationTimestamp="2025-12-15 14:21:44 +0000 UTC" firstStartedPulling="2025-12-15 14:21:45.522696128 +0000 UTC m=+1667.374718566" lastFinishedPulling="2025-12-15 14:21:48.912183802 +0000 UTC m=+1670.764206240" observedRunningTime="2025-12-15 14:21:49.819013756 +0000 UTC m=+1671.671036214" watchObservedRunningTime="2025-12-15 14:21:49.826129057 +0000 UTC m=+1671.678151515" Dec 15 14:21:50 crc kubenswrapper[4794]: I1215 14:21:50.768984 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerStarted","Data":"a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7"} Dec 15 14:21:51 crc kubenswrapper[4794]: I1215 14:21:51.778064 4794 generic.go:334] "Generic (PLEG): container finished" podID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerID="a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7" exitCode=0 Dec 15 14:21:51 crc kubenswrapper[4794]: I1215 14:21:51.778367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerDied","Data":"a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7"} Dec 15 14:21:52 crc kubenswrapper[4794]: I1215 14:21:52.025681 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:52 crc kubenswrapper[4794]: I1215 14:21:52.146122 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:52 crc kubenswrapper[4794]: I1215 14:21:52.816022 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerStarted","Data":"8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2"} Dec 15 14:21:52 crc kubenswrapper[4794]: I1215 14:21:52.842237 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftr6k" podStartSLOduration=3.135858303 podStartE2EDuration="5.842213803s" podCreationTimestamp="2025-12-15 14:21:47 +0000 UTC" firstStartedPulling="2025-12-15 14:21:49.758484603 +0000 UTC m=+1671.610507041" lastFinishedPulling="2025-12-15 14:21:52.464840103 +0000 UTC m=+1674.316862541" observedRunningTime="2025-12-15 14:21:52.841105612 +0000 
UTC m=+1674.693128070" watchObservedRunningTime="2025-12-15 14:21:52.842213803 +0000 UTC m=+1674.694236251" Dec 15 14:21:53 crc kubenswrapper[4794]: I1215 14:21:53.737242 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:21:53 crc kubenswrapper[4794]: E1215 14:21:53.737455 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:21:57 crc kubenswrapper[4794]: I1215 14:21:57.146043 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:57 crc kubenswrapper[4794]: I1215 14:21:57.149678 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:57 crc kubenswrapper[4794]: I1215 14:21:57.868157 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.202393 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.202446 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.252537 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.323437 4794 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.329537 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.330031 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-kuttl-api-log" containerID="cri-o://6c2891f7cdbbb90d825577bdc72819914f0867412d9569d5960257ea69a4870c" gracePeriod=30 Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.330564 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-api" containerID="cri-o://e6e1264948080ec083ed47e7d3dbae0ec674264e29171647a325e8b51d8bb345" gracePeriod=30 Dec 15 14:21:58 crc kubenswrapper[4794]: I1215 14:21:58.962259 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.183225 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.178:9322/\": dial tcp 10.217.0.178:9322: connect: connection refused" Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.183278 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9322/\": dial tcp 10.217.0.178:9322: connect: connection refused" Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.903498 4794 generic.go:334] 
"Generic (PLEG): container finished" podID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerID="e6e1264948080ec083ed47e7d3dbae0ec674264e29171647a325e8b51d8bb345" exitCode=0 Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.903525 4794 generic.go:334] "Generic (PLEG): container finished" podID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerID="6c2891f7cdbbb90d825577bdc72819914f0867412d9569d5960257ea69a4870c" exitCode=143 Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.904289 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0","Type":"ContainerDied","Data":"e6e1264948080ec083ed47e7d3dbae0ec674264e29171647a325e8b51d8bb345"} Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.904311 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0","Type":"ContainerDied","Data":"6c2891f7cdbbb90d825577bdc72819914f0867412d9569d5960257ea69a4870c"} Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.904416 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-kuttl-api-log" containerID="cri-o://66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29" gracePeriod=30 Dec 15 14:21:59 crc kubenswrapper[4794]: I1215 14:21:59.904728 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-api" containerID="cri-o://3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f" gracePeriod=30 Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.060427 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.095167 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-config-data\") pod \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.095221 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-combined-ca-bundle\") pod \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.095249 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-custom-prometheus-ca\") pod \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.095268 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvlr5\" (UniqueName: \"kubernetes.io/projected/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-kube-api-access-kvlr5\") pod \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.095287 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-cert-memcached-mtls\") pod \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.131992 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" (UID: "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.132099 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" (UID: "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.132258 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-kube-api-access-kvlr5" (OuterVolumeSpecName: "kube-api-access-kvlr5") pod "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" (UID: "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0"). InnerVolumeSpecName "kube-api-access-kvlr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.159555 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-config-data" (OuterVolumeSpecName: "config-data") pod "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" (UID: "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.180318 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" (UID: "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.196860 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-logs\") pod \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\" (UID: \"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.197562 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.197608 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.197624 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.197637 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvlr5\" (UniqueName: \"kubernetes.io/projected/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-kube-api-access-kvlr5\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.197648 4794 
reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.197625 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-logs" (OuterVolumeSpecName: "logs") pod "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" (UID: "cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.298827 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.646311 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.815697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-config-data\") pod \"909724e6-74b1-4a24-a66d-79ab7b2d7801\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.816281 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-cert-memcached-mtls\") pod \"909724e6-74b1-4a24-a66d-79ab7b2d7801\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.816682 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-combined-ca-bundle\") pod \"909724e6-74b1-4a24-a66d-79ab7b2d7801\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.816787 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909724e6-74b1-4a24-a66d-79ab7b2d7801-logs\") pod \"909724e6-74b1-4a24-a66d-79ab7b2d7801\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.816887 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-custom-prometheus-ca\") pod \"909724e6-74b1-4a24-a66d-79ab7b2d7801\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.817056 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn7zk\" (UniqueName: \"kubernetes.io/projected/909724e6-74b1-4a24-a66d-79ab7b2d7801-kube-api-access-pn7zk\") pod \"909724e6-74b1-4a24-a66d-79ab7b2d7801\" (UID: \"909724e6-74b1-4a24-a66d-79ab7b2d7801\") " Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.817400 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909724e6-74b1-4a24-a66d-79ab7b2d7801-logs" (OuterVolumeSpecName: "logs") pod "909724e6-74b1-4a24-a66d-79ab7b2d7801" (UID: "909724e6-74b1-4a24-a66d-79ab7b2d7801"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.817803 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909724e6-74b1-4a24-a66d-79ab7b2d7801-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.820272 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909724e6-74b1-4a24-a66d-79ab7b2d7801-kube-api-access-pn7zk" (OuterVolumeSpecName: "kube-api-access-pn7zk") pod "909724e6-74b1-4a24-a66d-79ab7b2d7801" (UID: "909724e6-74b1-4a24-a66d-79ab7b2d7801"). InnerVolumeSpecName "kube-api-access-pn7zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.840241 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "909724e6-74b1-4a24-a66d-79ab7b2d7801" (UID: "909724e6-74b1-4a24-a66d-79ab7b2d7801"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.852726 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "909724e6-74b1-4a24-a66d-79ab7b2d7801" (UID: "909724e6-74b1-4a24-a66d-79ab7b2d7801"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.860476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-config-data" (OuterVolumeSpecName: "config-data") pod "909724e6-74b1-4a24-a66d-79ab7b2d7801" (UID: "909724e6-74b1-4a24-a66d-79ab7b2d7801"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.896167 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "909724e6-74b1-4a24-a66d-79ab7b2d7801" (UID: "909724e6-74b1-4a24-a66d-79ab7b2d7801"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919531 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919567 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919633 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919650 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/909724e6-74b1-4a24-a66d-79ab7b2d7801-custom-prometheus-ca\") on node \"crc\" 
DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919644 4794 generic.go:334] "Generic (PLEG): container finished" podID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerID="3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f" exitCode=0 Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919670 4794 generic.go:334] "Generic (PLEG): container finished" podID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerID="66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29" exitCode=143 Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919677 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"909724e6-74b1-4a24-a66d-79ab7b2d7801","Type":"ContainerDied","Data":"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f"} Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919713 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"909724e6-74b1-4a24-a66d-79ab7b2d7801","Type":"ContainerDied","Data":"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29"} Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919663 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn7zk\" (UniqueName: \"kubernetes.io/projected/909724e6-74b1-4a24-a66d-79ab7b2d7801-kube-api-access-pn7zk\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919727 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"909724e6-74b1-4a24-a66d-79ab7b2d7801","Type":"ContainerDied","Data":"30f51270c8c46ca00f28e17da9650b65be0411962a4cf49da27bc2c2f95c6aca"} Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919736 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.919747 4794 scope.go:117] "RemoveContainer" containerID="3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.922887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0","Type":"ContainerDied","Data":"345caece2e6ed05e9126c7d859ef39a6c388fd5a50f854fbbf95abfda9cdcb76"} Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.922972 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.960373 4794 scope.go:117] "RemoveContainer" containerID="66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.974866 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.988371 4794 scope.go:117] "RemoveContainer" containerID="3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.988486 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:22:00 crc kubenswrapper[4794]: E1215 14:22:00.989117 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f\": container with ID starting with 3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f not found: ID does not exist" containerID="3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.989161 4794 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f"} err="failed to get container status \"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f\": rpc error: code = NotFound desc = could not find container \"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f\": container with ID starting with 3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f not found: ID does not exist" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.989199 4794 scope.go:117] "RemoveContainer" containerID="66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29" Dec 15 14:22:00 crc kubenswrapper[4794]: E1215 14:22:00.989677 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29\": container with ID starting with 66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29 not found: ID does not exist" containerID="66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.989702 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29"} err="failed to get container status \"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29\": rpc error: code = NotFound desc = could not find container \"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29\": container with ID starting with 66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29 not found: ID does not exist" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.989728 4794 scope.go:117] "RemoveContainer" containerID="3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 
14:22:00.989980 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f"} err="failed to get container status \"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f\": rpc error: code = NotFound desc = could not find container \"3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f\": container with ID starting with 3255d00c11a5d48a063bf27f5d222b8ed21a5e84825be340d59d106370c0278f not found: ID does not exist" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.990001 4794 scope.go:117] "RemoveContainer" containerID="66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.990253 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29"} err="failed to get container status \"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29\": rpc error: code = NotFound desc = could not find container \"66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29\": container with ID starting with 66b8fee31d9c6465a066cc611ed74d1cfb53d1cad4cb94906fc3d82f8f0dae29 not found: ID does not exist" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.990273 4794 scope.go:117] "RemoveContainer" containerID="e6e1264948080ec083ed47e7d3dbae0ec674264e29171647a325e8b51d8bb345" Dec 15 14:22:00 crc kubenswrapper[4794]: I1215 14:22:00.999608 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.009250 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.053919 4794 scope.go:117] "RemoveContainer" 
containerID="6c2891f7cdbbb90d825577bdc72819914f0867412d9569d5960257ea69a4870c" Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.590163 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.590386 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-kuttl-api-log" containerID="cri-o://772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664" gracePeriod=30 Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.590519 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-api" containerID="cri-o://18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6" gracePeriod=30 Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.858256 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftr6k"] Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.858484 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftr6k" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="registry-server" containerID="cri-o://8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2" gracePeriod=2 Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.933984 4794 generic.go:334] "Generic (PLEG): container finished" podID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerID="772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664" exitCode=143 Dec 15 14:22:01 crc kubenswrapper[4794]: I1215 14:22:01.934031 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"32c71f2d-c68f-4021-8a30-2ea58da786c5","Type":"ContainerDied","Data":"772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664"} Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.335155 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.441536 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-utilities\") pod \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.441742 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8cz\" (UniqueName: \"kubernetes.io/projected/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-kube-api-access-fs8cz\") pod \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.441795 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-catalog-content\") pod \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\" (UID: \"8c30a2c3-4d23-4189-90c5-14bb73c79c5d\") " Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.442597 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-utilities" (OuterVolumeSpecName: "utilities") pod "8c30a2c3-4d23-4189-90c5-14bb73c79c5d" (UID: "8c30a2c3-4d23-4189-90c5-14bb73c79c5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.447503 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-kube-api-access-fs8cz" (OuterVolumeSpecName: "kube-api-access-fs8cz") pod "8c30a2c3-4d23-4189-90c5-14bb73c79c5d" (UID: "8c30a2c3-4d23-4189-90c5-14bb73c79c5d"). InnerVolumeSpecName "kube-api-access-fs8cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.492866 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c30a2c3-4d23-4189-90c5-14bb73c79c5d" (UID: "8c30a2c3-4d23-4189-90c5-14bb73c79c5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.543842 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8cz\" (UniqueName: \"kubernetes.io/projected/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-kube-api-access-fs8cz\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.543875 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.543888 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c30a2c3-4d23-4189-90c5-14bb73c79c5d-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.754067 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" 
path="/var/lib/kubelet/pods/909724e6-74b1-4a24-a66d-79ab7b2d7801/volumes" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.755285 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" path="/var/lib/kubelet/pods/cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0/volumes" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.756124 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zblqx"] Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.769231 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-zblqx"] Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.825373 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.825985 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="f3b646ec-0aff-43e4-97b2-991ca571ff83" containerName="watcher-applier" containerID="cri-o://95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3" gracePeriod=30 Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842292 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher2cb3-account-delete-cqxp6"] Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842648 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="extract-content" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842666 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="extract-content" Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842682 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-api" Dec 15 
14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842688 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-api" Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842697 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-kuttl-api-log" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842704 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-kuttl-api-log" Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842713 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-kuttl-api-log" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842720 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-kuttl-api-log" Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842735 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="extract-utilities" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842741 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="extract-utilities" Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842751 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-api" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842756 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-api" Dec 15 14:22:02 crc kubenswrapper[4794]: E1215 14:22:02.842766 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="registry-server" Dec 
15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842772 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="registry-server" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842933 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-api" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842952 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdbadb79-adaf-4e3b-a3a4-ed5b92a18dc0" containerName="watcher-kuttl-api-log" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842961 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerName="registry-server" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842973 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-kuttl-api-log" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.842982 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="909724e6-74b1-4a24-a66d-79ab7b2d7801" containerName="watcher-api" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.843538 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.856314 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2cb3-account-delete-cqxp6"] Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.879389 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.879723 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" containerName="watcher-decision-engine" containerID="cri-o://46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881" gracePeriod=30 Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.880772 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.952619 4794 generic.go:334] "Generic (PLEG): container finished" podID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" containerID="8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2" exitCode=0 Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.952663 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftr6k" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.952696 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerDied","Data":"8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2"} Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.952765 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftr6k" event={"ID":"8c30a2c3-4d23-4189-90c5-14bb73c79c5d","Type":"ContainerDied","Data":"2c093c9492de1fc0f5363f6bb646fdd226247596780afb81233478c313d9b0d4"} Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.952792 4794 scope.go:117] "RemoveContainer" containerID="8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.953853 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qgb\" (UniqueName: \"kubernetes.io/projected/d6c0aa1c-83fd-45e4-8213-d602734eaefd-kube-api-access-n2qgb\") pod \"watcher2cb3-account-delete-cqxp6\" (UID: \"d6c0aa1c-83fd-45e4-8213-d602734eaefd\") " pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.958956 4794 generic.go:334] "Generic (PLEG): container finished" podID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerID="18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6" exitCode=0 Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.958996 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"32c71f2d-c68f-4021-8a30-2ea58da786c5","Type":"ContainerDied","Data":"18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6"} Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.959024 4794 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"32c71f2d-c68f-4021-8a30-2ea58da786c5","Type":"ContainerDied","Data":"7ba554d7e431d124db174289cf0411a2f96505a725d8d124866338a9beeff6f1"} Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.959088 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.974437 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftr6k"] Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.980773 4794 scope.go:117] "RemoveContainer" containerID="a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7" Dec 15 14:22:02 crc kubenswrapper[4794]: I1215 14:22:02.981041 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftr6k"] Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.016484 4794 scope.go:117] "RemoveContainer" containerID="17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.054948 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-cert-memcached-mtls\") pod \"32c71f2d-c68f-4021-8a30-2ea58da786c5\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.054982 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-custom-prometheus-ca\") pod \"32c71f2d-c68f-4021-8a30-2ea58da786c5\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.055077 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-config-data\") pod \"32c71f2d-c68f-4021-8a30-2ea58da786c5\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.055143 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c71f2d-c68f-4021-8a30-2ea58da786c5-logs\") pod \"32c71f2d-c68f-4021-8a30-2ea58da786c5\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.055197 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbhqb\" (UniqueName: \"kubernetes.io/projected/32c71f2d-c68f-4021-8a30-2ea58da786c5-kube-api-access-lbhqb\") pod \"32c71f2d-c68f-4021-8a30-2ea58da786c5\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.055279 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-combined-ca-bundle\") pod \"32c71f2d-c68f-4021-8a30-2ea58da786c5\" (UID: \"32c71f2d-c68f-4021-8a30-2ea58da786c5\") " Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.055645 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qgb\" (UniqueName: \"kubernetes.io/projected/d6c0aa1c-83fd-45e4-8213-d602734eaefd-kube-api-access-n2qgb\") pod \"watcher2cb3-account-delete-cqxp6\" (UID: \"d6c0aa1c-83fd-45e4-8213-d602734eaefd\") " pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.055789 4794 scope.go:117] "RemoveContainer" containerID="8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2" Dec 15 14:22:03 crc kubenswrapper[4794]: E1215 14:22:03.056728 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2\": container with ID starting with 8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2 not found: ID does not exist" containerID="8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.056775 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c71f2d-c68f-4021-8a30-2ea58da786c5-logs" (OuterVolumeSpecName: "logs") pod "32c71f2d-c68f-4021-8a30-2ea58da786c5" (UID: "32c71f2d-c68f-4021-8a30-2ea58da786c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.056783 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2"} err="failed to get container status \"8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2\": rpc error: code = NotFound desc = could not find container \"8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2\": container with ID starting with 8f91ef2e0a657888e05d44b2bea07e177c1485ba5374db1f05f1b6f222a8f2c2 not found: ID does not exist" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.056816 4794 scope.go:117] "RemoveContainer" containerID="a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7" Dec 15 14:22:03 crc kubenswrapper[4794]: E1215 14:22:03.060612 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7\": container with ID starting with a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7 not found: ID does not exist" containerID="a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7" Dec 15 14:22:03 crc 
kubenswrapper[4794]: I1215 14:22:03.060665 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7"} err="failed to get container status \"a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7\": rpc error: code = NotFound desc = could not find container \"a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7\": container with ID starting with a138472cc8783c5c8e618c20fa9560f9727853703c7b36effcdb93b4cf91ebc7 not found: ID does not exist" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.060695 4794 scope.go:117] "RemoveContainer" containerID="17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a" Dec 15 14:22:03 crc kubenswrapper[4794]: E1215 14:22:03.064079 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a\": container with ID starting with 17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a not found: ID does not exist" containerID="17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.064119 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a"} err="failed to get container status \"17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a\": rpc error: code = NotFound desc = could not find container \"17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a\": container with ID starting with 17a8c8d60d95f6403e0d05f8edfcc026c2457a618ff238e2ab93fcee843f521a not found: ID does not exist" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.064147 4794 scope.go:117] "RemoveContainer" containerID="18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6" Dec 15 
14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.074814 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2qgb\" (UniqueName: \"kubernetes.io/projected/d6c0aa1c-83fd-45e4-8213-d602734eaefd-kube-api-access-n2qgb\") pod \"watcher2cb3-account-delete-cqxp6\" (UID: \"d6c0aa1c-83fd-45e4-8213-d602734eaefd\") " pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.076771 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c71f2d-c68f-4021-8a30-2ea58da786c5-kube-api-access-lbhqb" (OuterVolumeSpecName: "kube-api-access-lbhqb") pod "32c71f2d-c68f-4021-8a30-2ea58da786c5" (UID: "32c71f2d-c68f-4021-8a30-2ea58da786c5"). InnerVolumeSpecName "kube-api-access-lbhqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.084683 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32c71f2d-c68f-4021-8a30-2ea58da786c5" (UID: "32c71f2d-c68f-4021-8a30-2ea58da786c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.094904 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "32c71f2d-c68f-4021-8a30-2ea58da786c5" (UID: "32c71f2d-c68f-4021-8a30-2ea58da786c5"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.154715 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-config-data" (OuterVolumeSpecName: "config-data") pod "32c71f2d-c68f-4021-8a30-2ea58da786c5" (UID: "32c71f2d-c68f-4021-8a30-2ea58da786c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.165741 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbhqb\" (UniqueName: \"kubernetes.io/projected/32c71f2d-c68f-4021-8a30-2ea58da786c5-kube-api-access-lbhqb\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.165803 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.165818 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.165829 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.165847 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32c71f2d-c68f-4021-8a30-2ea58da786c5-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.176746 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "32c71f2d-c68f-4021-8a30-2ea58da786c5" (UID: "32c71f2d-c68f-4021-8a30-2ea58da786c5"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.192796 4794 scope.go:117] "RemoveContainer" containerID="772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.202944 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.212198 4794 scope.go:117] "RemoveContainer" containerID="18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6" Dec 15 14:22:03 crc kubenswrapper[4794]: E1215 14:22:03.212532 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6\": container with ID starting with 18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6 not found: ID does not exist" containerID="18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.212567 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6"} err="failed to get container status \"18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6\": rpc error: code = NotFound desc = could not find container \"18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6\": container with ID starting with 18e656fa46d7b2702c76b6cc6307667bf446ea4215a2b9c9650f05841c9ef8b6 not found: ID does not exist" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.212653 4794 
scope.go:117] "RemoveContainer" containerID="772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664" Dec 15 14:22:03 crc kubenswrapper[4794]: E1215 14:22:03.212872 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664\": container with ID starting with 772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664 not found: ID does not exist" containerID="772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.212899 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664"} err="failed to get container status \"772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664\": rpc error: code = NotFound desc = could not find container \"772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664\": container with ID starting with 772c308c11dde92d88fb6f9c628a5e6c3110a8d890e4e4a3de2924738894f664 not found: ID does not exist" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.271198 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/32c71f2d-c68f-4021-8a30-2ea58da786c5-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.301571 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.306346 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.685291 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher2cb3-account-delete-cqxp6"] Dec 15 14:22:03 crc kubenswrapper[4794]: 
W1215 14:22:03.686461 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6c0aa1c_83fd_45e4_8213_d602734eaefd.slice/crio-cf04fc57a5e159677a7f01dfc780b6997cfdfae6dea43af8fc8d69ff403cac00 WatchSource:0}: Error finding container cf04fc57a5e159677a7f01dfc780b6997cfdfae6dea43af8fc8d69ff403cac00: Status 404 returned error can't find the container with id cf04fc57a5e159677a7f01dfc780b6997cfdfae6dea43af8fc8d69ff403cac00 Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.966828 4794 generic.go:334] "Generic (PLEG): container finished" podID="d6c0aa1c-83fd-45e4-8213-d602734eaefd" containerID="8907a4744e92f2d4e3d7dbad46eb06efede8489b96338d6a9059fdbfd4d25658" exitCode=0 Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.966874 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" event={"ID":"d6c0aa1c-83fd-45e4-8213-d602734eaefd","Type":"ContainerDied","Data":"8907a4744e92f2d4e3d7dbad46eb06efede8489b96338d6a9059fdbfd4d25658"} Dec 15 14:22:03 crc kubenswrapper[4794]: I1215 14:22:03.967135 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" event={"ID":"d6c0aa1c-83fd-45e4-8213-d602734eaefd","Type":"ContainerStarted","Data":"cf04fc57a5e159677a7f01dfc780b6997cfdfae6dea43af8fc8d69ff403cac00"} Dec 15 14:22:04 crc kubenswrapper[4794]: E1215 14:22:04.162025 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:22:04 crc kubenswrapper[4794]: E1215 14:22:04.163541 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:22:04 crc kubenswrapper[4794]: E1215 14:22:04.165028 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:22:04 crc kubenswrapper[4794]: E1215 14:22:04.165074 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="f3b646ec-0aff-43e4-97b2-991ca571ff83" containerName="watcher-applier" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.745136 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" path="/var/lib/kubelet/pods/32c71f2d-c68f-4021-8a30-2ea58da786c5/volumes" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.746014 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c30a2c3-4d23-4189-90c5-14bb73c79c5d" path="/var/lib/kubelet/pods/8c30a2c3-4d23-4189-90c5-14bb73c79c5d/volumes" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.746630 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3075f03-ed4f-4c07-9045-cc157ff544f3" path="/var/lib/kubelet/pods/c3075f03-ed4f-4c07-9045-cc157ff544f3/volumes" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.828249 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.977825 4794 generic.go:334] "Generic (PLEG): container finished" podID="9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" containerID="46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881" exitCode=0 Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.977891 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.977886 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a","Type":"ContainerDied","Data":"46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881"} Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.977952 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a","Type":"ContainerDied","Data":"ad31122ee09d9147b422985eb33efa877aa9fabefb194845df503f847c33904d"} Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.977975 4794 scope.go:117] "RemoveContainer" containerID="46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.997772 4794 scope.go:117] "RemoveContainer" containerID="46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.997790 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-combined-ca-bundle\") pod \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.997865 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-logs\") pod \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.997917 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzhnn\" (UniqueName: \"kubernetes.io/projected/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-kube-api-access-rzhnn\") pod \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.997950 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-custom-prometheus-ca\") pod \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.998042 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-cert-memcached-mtls\") pod \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.998130 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-config-data\") pod \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\" (UID: \"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a\") " Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.998297 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-logs" (OuterVolumeSpecName: "logs") pod "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" (UID: 
"9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:04 crc kubenswrapper[4794]: I1215 14:22:04.998596 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: E1215 14:22:05.001051 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881\": container with ID starting with 46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881 not found: ID does not exist" containerID="46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.001103 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881"} err="failed to get container status \"46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881\": rpc error: code = NotFound desc = could not find container \"46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881\": container with ID starting with 46d7add4aecaa8f552291059bb369c23f734793c7e97959b7e43805bcdae5881 not found: ID does not exist" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.003304 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-kube-api-access-rzhnn" (OuterVolumeSpecName: "kube-api-access-rzhnn") pod "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" (UID: "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a"). InnerVolumeSpecName "kube-api-access-rzhnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.073722 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" (UID: "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.078341 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" (UID: "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.094365 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-config-data" (OuterVolumeSpecName: "config-data") pod "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" (UID: "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.094794 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" (UID: "9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.103655 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.103702 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.103713 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.103725 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzhnn\" (UniqueName: \"kubernetes.io/projected/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-kube-api-access-rzhnn\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.103736 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.343165 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.343471 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-central-agent" containerID="cri-o://99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56" gracePeriod=30 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 
14:22:05.343532 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="proxy-httpd" containerID="cri-o://d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d" gracePeriod=30 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.343630 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-notification-agent" containerID="cri-o://397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99" gracePeriod=30 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.343564 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="sg-core" containerID="cri-o://90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a" gracePeriod=30 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.360982 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.382540 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.403661 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.414707 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.418908 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2qgb\" (UniqueName: \"kubernetes.io/projected/d6c0aa1c-83fd-45e4-8213-d602734eaefd-kube-api-access-n2qgb\") pod \"d6c0aa1c-83fd-45e4-8213-d602734eaefd\" (UID: \"d6c0aa1c-83fd-45e4-8213-d602734eaefd\") " Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.421815 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c0aa1c-83fd-45e4-8213-d602734eaefd-kube-api-access-n2qgb" (OuterVolumeSpecName: "kube-api-access-n2qgb") pod "d6c0aa1c-83fd-45e4-8213-d602734eaefd" (UID: "d6c0aa1c-83fd-45e4-8213-d602734eaefd"). InnerVolumeSpecName "kube-api-access-n2qgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.521995 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2qgb\" (UniqueName: \"kubernetes.io/projected/d6c0aa1c-83fd-45e4-8213-d602734eaefd-kube-api-access-n2qgb\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.989251 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" event={"ID":"d6c0aa1c-83fd-45e4-8213-d602734eaefd","Type":"ContainerDied","Data":"cf04fc57a5e159677a7f01dfc780b6997cfdfae6dea43af8fc8d69ff403cac00"} Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.989311 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf04fc57a5e159677a7f01dfc780b6997cfdfae6dea43af8fc8d69ff403cac00" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.989308 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher2cb3-account-delete-cqxp6" Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.992439 4794 generic.go:334] "Generic (PLEG): container finished" podID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerID="d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d" exitCode=0 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.992471 4794 generic.go:334] "Generic (PLEG): container finished" podID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerID="90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a" exitCode=2 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.992480 4794 generic.go:334] "Generic (PLEG): container finished" podID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerID="99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56" exitCode=0 Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.992498 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerDied","Data":"d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d"} Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.992520 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerDied","Data":"90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a"} Dec 15 14:22:05 crc kubenswrapper[4794]: I1215 14:22:05.992533 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerDied","Data":"99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56"} Dec 15 14:22:06 crc kubenswrapper[4794]: I1215 14:22:06.763596 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" 
path="/var/lib/kubelet/pods/9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a/volumes" Dec 15 14:22:06 crc kubenswrapper[4794]: I1215 14:22:06.944853 4794 scope.go:117] "RemoveContainer" containerID="8d62df96eb5f27da9a0c57c2145a0f3b9018b8526678106afcd4ff5a703fe513" Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.737200 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:22:07 crc kubenswrapper[4794]: E1215 14:22:07.737852 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.839369 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-2cb3-account-create-mzgzx"] Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.860421 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-2cb3-account-create-mzgzx"] Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.870568 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-bffd4"] Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.877696 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher2cb3-account-delete-cqxp6"] Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.883152 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-bffd4"] Dec 15 14:22:07 crc kubenswrapper[4794]: I1215 14:22:07.888563 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher2cb3-account-delete-cqxp6"] Dec 15 
14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.025220 4794 generic.go:334] "Generic (PLEG): container finished" podID="f3b646ec-0aff-43e4-97b2-991ca571ff83" containerID="95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3" exitCode=0 Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.025272 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f3b646ec-0aff-43e4-97b2-991ca571ff83","Type":"ContainerDied","Data":"95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3"} Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.401564 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.468028 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b646ec-0aff-43e4-97b2-991ca571ff83-logs\") pod \"f3b646ec-0aff-43e4-97b2-991ca571ff83\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.468168 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx7dl\" (UniqueName: \"kubernetes.io/projected/f3b646ec-0aff-43e4-97b2-991ca571ff83-kube-api-access-mx7dl\") pod \"f3b646ec-0aff-43e4-97b2-991ca571ff83\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.468218 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-config-data\") pod \"f3b646ec-0aff-43e4-97b2-991ca571ff83\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.468246 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-combined-ca-bundle\") pod \"f3b646ec-0aff-43e4-97b2-991ca571ff83\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.468648 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3b646ec-0aff-43e4-97b2-991ca571ff83-logs" (OuterVolumeSpecName: "logs") pod "f3b646ec-0aff-43e4-97b2-991ca571ff83" (UID: "f3b646ec-0aff-43e4-97b2-991ca571ff83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.469039 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-cert-memcached-mtls\") pod \"f3b646ec-0aff-43e4-97b2-991ca571ff83\" (UID: \"f3b646ec-0aff-43e4-97b2-991ca571ff83\") " Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.469431 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b646ec-0aff-43e4-97b2-991ca571ff83-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.484393 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b646ec-0aff-43e4-97b2-991ca571ff83-kube-api-access-mx7dl" (OuterVolumeSpecName: "kube-api-access-mx7dl") pod "f3b646ec-0aff-43e4-97b2-991ca571ff83" (UID: "f3b646ec-0aff-43e4-97b2-991ca571ff83"). InnerVolumeSpecName "kube-api-access-mx7dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.493463 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3b646ec-0aff-43e4-97b2-991ca571ff83" (UID: "f3b646ec-0aff-43e4-97b2-991ca571ff83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.521043 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-config-data" (OuterVolumeSpecName: "config-data") pod "f3b646ec-0aff-43e4-97b2-991ca571ff83" (UID: "f3b646ec-0aff-43e4-97b2-991ca571ff83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.530668 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "f3b646ec-0aff-43e4-97b2-991ca571ff83" (UID: "f3b646ec-0aff-43e4-97b2-991ca571ff83"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.570222 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.570265 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx7dl\" (UniqueName: \"kubernetes.io/projected/f3b646ec-0aff-43e4-97b2-991ca571ff83-kube-api-access-mx7dl\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.570283 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.570296 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b646ec-0aff-43e4-97b2-991ca571ff83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.750249 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515cc0ff-9ae0-4219-81e4-2e36747c35e8" path="/var/lib/kubelet/pods/515cc0ff-9ae0-4219-81e4-2e36747c35e8/volumes" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.751271 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5601b23-626c-4218-9a46-bf326eeeab6b" path="/var/lib/kubelet/pods/d5601b23-626c-4218-9a46-bf326eeeab6b/volumes" Dec 15 14:22:08 crc kubenswrapper[4794]: I1215 14:22:08.752293 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c0aa1c-83fd-45e4-8213-d602734eaefd" path="/var/lib/kubelet/pods/d6c0aa1c-83fd-45e4-8213-d602734eaefd/volumes" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.039680 4794 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"f3b646ec-0aff-43e4-97b2-991ca571ff83","Type":"ContainerDied","Data":"ecd213ac3f79e72278ded14723ff0c2da865e158aaa232c2b253f8e262a8e5d0"} Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.039763 4794 scope.go:117] "RemoveContainer" containerID="95cac5963ffcb236b12be762058907ecdc7ffb281c371fa78894f9c17044cbf3" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.039844 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.072905 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.081048 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.914788 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.991985 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgmps\" (UniqueName: \"kubernetes.io/projected/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-kube-api-access-fgmps\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992093 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-combined-ca-bundle\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992126 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-config-data\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992144 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-log-httpd\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992188 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-sg-core-conf-yaml\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992206 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-scripts\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992254 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-run-httpd\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.992271 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-ceilometer-tls-certs\") pod \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\" (UID: \"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c\") " Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.993144 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.993281 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.998932 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-kube-api-access-fgmps" (OuterVolumeSpecName: "kube-api-access-fgmps") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "kube-api-access-fgmps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:09 crc kubenswrapper[4794]: I1215 14:22:09.999832 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-scripts" (OuterVolumeSpecName: "scripts") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.029826 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.038146 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.068249 4794 generic.go:334] "Generic (PLEG): container finished" podID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerID="397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99" exitCode=0 Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.068461 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerDied","Data":"397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99"} Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.068654 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c","Type":"ContainerDied","Data":"a3fd2152eedd4456113f6b8285ae3f1ed7ae3d0609807a8e7530134a4ead9b05"} Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.068560 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.068727 4794 scope.go:117] "RemoveContainer" containerID="d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.091126 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-config-data" (OuterVolumeSpecName: "config-data") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094906 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094942 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094952 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094964 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094978 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgmps\" (UniqueName: \"kubernetes.io/projected/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-kube-api-access-fgmps\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094989 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.094997 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.105050 4794 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" (UID: "ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.184868 4794 scope.go:117] "RemoveContainer" containerID="90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.198164 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.221781 4794 scope.go:117] "RemoveContainer" containerID="397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.262567 4794 scope.go:117] "RemoveContainer" containerID="99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.282692 4794 scope.go:117] "RemoveContainer" containerID="d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.283227 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d\": container with ID starting with d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d not found: ID does not exist" containerID="d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.283263 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d"} err="failed to get container status \"d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d\": rpc error: code = NotFound desc = could not find container \"d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d\": container with ID starting with d88ad167a7fe84a13871fac04bb1de4a4d4d71f4fa94a32e614ee8b84995e91d not found: ID does not exist" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.283307 4794 scope.go:117] "RemoveContainer" containerID="90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.283613 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a\": container with ID starting with 90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a not found: ID does not exist" containerID="90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.283665 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a"} err="failed to get container status \"90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a\": rpc error: code = NotFound desc = could not find container \"90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a\": container with ID starting with 90378ba6da4680614434491821c8478a218cc84a630f67030a2c1c99f450284a not found: ID does not exist" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.283701 4794 scope.go:117] "RemoveContainer" containerID="397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.284023 4794 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99\": container with ID starting with 397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99 not found: ID does not exist" containerID="397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.284055 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99"} err="failed to get container status \"397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99\": rpc error: code = NotFound desc = could not find container \"397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99\": container with ID starting with 397868a1ac305b59d023bb0abe5e538fad342f76688400410ce6559372623f99 not found: ID does not exist" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.284083 4794 scope.go:117] "RemoveContainer" containerID="99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.284313 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56\": container with ID starting with 99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56 not found: ID does not exist" containerID="99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.284390 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56"} err="failed to get container status \"99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56\": rpc error: code = NotFound desc = could not find container 
\"99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56\": container with ID starting with 99d3834101ffed080e4eec9decec507ae388cf01b92c040ce3533b15df2b0e56 not found: ID does not exist" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.408437 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.421053 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430207 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430536 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b646ec-0aff-43e4-97b2-991ca571ff83" containerName="watcher-applier" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430555 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b646ec-0aff-43e4-97b2-991ca571ff83" containerName="watcher-applier" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430572 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="proxy-httpd" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430633 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="proxy-httpd" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430641 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c0aa1c-83fd-45e4-8213-d602734eaefd" containerName="mariadb-account-delete" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430648 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c0aa1c-83fd-45e4-8213-d602734eaefd" containerName="mariadb-account-delete" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430664 4794 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-kuttl-api-log" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430671 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-kuttl-api-log" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430679 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" containerName="watcher-decision-engine" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430686 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" containerName="watcher-decision-engine" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430696 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="sg-core" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430702 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="sg-core" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430720 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-api" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430727 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-api" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430741 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-central-agent" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430748 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-central-agent" Dec 15 14:22:10 crc kubenswrapper[4794]: E1215 14:22:10.430761 4794 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-notification-agent" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430769 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-notification-agent" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430943 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-notification-agent" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430953 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b646ec-0aff-43e4-97b2-991ca571ff83" containerName="watcher-applier" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430966 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="proxy-httpd" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430976 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0b86f5-c4a7-4f93-9cc9-1e544cf60e6a" containerName="watcher-decision-engine" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430988 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="sg-core" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.430998 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c0aa1c-83fd-45e4-8213-d602734eaefd" containerName="mariadb-account-delete" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.431006 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-kuttl-api-log" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.431020 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" containerName="ceilometer-central-agent" Dec 15 14:22:10 crc 
kubenswrapper[4794]: I1215 14:22:10.431030 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c71f2d-c68f-4021-8a30-2ea58da786c5" containerName="watcher-api" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.432523 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.438369 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.438624 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.441913 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.449762 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.603968 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-config-data\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604045 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604100 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604154 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604185 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-log-httpd\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604206 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-scripts\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604301 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6spq\" (UniqueName: \"kubernetes.io/projected/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-kube-api-access-k6spq\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.604360 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-run-httpd\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706290 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-log-httpd\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706370 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-scripts\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706415 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6spq\" (UniqueName: \"kubernetes.io/projected/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-kube-api-access-k6spq\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706437 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-run-httpd\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706521 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-config-data\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706559 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706610 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706631 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706839 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-log-httpd\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.706922 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-run-httpd\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.710945 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-scripts\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.711271 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.711346 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.712707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.720949 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-config-data\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.725600 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6spq\" (UniqueName: \"kubernetes.io/projected/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-kube-api-access-k6spq\") pod \"ceilometer-0\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") 
" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.748699 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.751542 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c" path="/var/lib/kubelet/pods/ccd3d109-0fdd-48eb-8b4b-cbaf1f57258c/volumes" Dec 15 14:22:10 crc kubenswrapper[4794]: I1215 14:22:10.752365 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b646ec-0aff-43e4-97b2-991ca571ff83" path="/var/lib/kubelet/pods/f3b646ec-0aff-43e4-97b2-991ca571ff83/volumes" Dec 15 14:22:11 crc kubenswrapper[4794]: I1215 14:22:11.207441 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.090153 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerStarted","Data":"736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6"} Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.090495 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerStarted","Data":"2f5c764712af362136f695591835c2105a27e584038fe1219becaf8a1efe1c7c"} Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.151413 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-sg2tp"] Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.152835 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.168783 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sg2tp"] Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.336051 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d77rk\" (UniqueName: \"kubernetes.io/projected/14db7f83-7c85-4926-a966-bad5401ac7c9-kube-api-access-d77rk\") pod \"watcher-db-create-sg2tp\" (UID: \"14db7f83-7c85-4926-a966-bad5401ac7c9\") " pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.437891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d77rk\" (UniqueName: \"kubernetes.io/projected/14db7f83-7c85-4926-a966-bad5401ac7c9-kube-api-access-d77rk\") pod \"watcher-db-create-sg2tp\" (UID: \"14db7f83-7c85-4926-a966-bad5401ac7c9\") " pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.456514 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d77rk\" (UniqueName: \"kubernetes.io/projected/14db7f83-7c85-4926-a966-bad5401ac7c9-kube-api-access-d77rk\") pod \"watcher-db-create-sg2tp\" (UID: \"14db7f83-7c85-4926-a966-bad5401ac7c9\") " pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.469132 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:12 crc kubenswrapper[4794]: I1215 14:22:12.930431 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sg2tp"] Dec 15 14:22:13 crc kubenswrapper[4794]: I1215 14:22:13.099995 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-sg2tp" event={"ID":"14db7f83-7c85-4926-a966-bad5401ac7c9","Type":"ContainerStarted","Data":"81016abfcb438c6f01109a307bc9cf239c0d1d1009f7e66a13bff40c9cf2c029"} Dec 15 14:22:14 crc kubenswrapper[4794]: I1215 14:22:14.108447 4794 generic.go:334] "Generic (PLEG): container finished" podID="14db7f83-7c85-4926-a966-bad5401ac7c9" containerID="15b09cb321accff988017d1184ac1bec2e53f62bc8230fd60751c9336e69de5d" exitCode=0 Dec 15 14:22:14 crc kubenswrapper[4794]: I1215 14:22:14.108511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-sg2tp" event={"ID":"14db7f83-7c85-4926-a966-bad5401ac7c9","Type":"ContainerDied","Data":"15b09cb321accff988017d1184ac1bec2e53f62bc8230fd60751c9336e69de5d"} Dec 15 14:22:14 crc kubenswrapper[4794]: I1215 14:22:14.111367 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerStarted","Data":"41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5"} Dec 15 14:22:14 crc kubenswrapper[4794]: I1215 14:22:14.111404 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerStarted","Data":"99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400"} Dec 15 14:22:15 crc kubenswrapper[4794]: I1215 14:22:15.122167 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerStarted","Data":"6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23"} Dec 15 14:22:15 crc kubenswrapper[4794]: I1215 14:22:15.149426 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.5565798069999999 podStartE2EDuration="5.149400096s" podCreationTimestamp="2025-12-15 14:22:10 +0000 UTC" firstStartedPulling="2025-12-15 14:22:11.202929699 +0000 UTC m=+1693.054952157" lastFinishedPulling="2025-12-15 14:22:14.795750008 +0000 UTC m=+1696.647772446" observedRunningTime="2025-12-15 14:22:15.147336848 +0000 UTC m=+1696.999359286" watchObservedRunningTime="2025-12-15 14:22:15.149400096 +0000 UTC m=+1697.001422544" Dec 15 14:22:15 crc kubenswrapper[4794]: I1215 14:22:15.450950 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:15 crc kubenswrapper[4794]: I1215 14:22:15.597203 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d77rk\" (UniqueName: \"kubernetes.io/projected/14db7f83-7c85-4926-a966-bad5401ac7c9-kube-api-access-d77rk\") pod \"14db7f83-7c85-4926-a966-bad5401ac7c9\" (UID: \"14db7f83-7c85-4926-a966-bad5401ac7c9\") " Dec 15 14:22:15 crc kubenswrapper[4794]: I1215 14:22:15.602195 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14db7f83-7c85-4926-a966-bad5401ac7c9-kube-api-access-d77rk" (OuterVolumeSpecName: "kube-api-access-d77rk") pod "14db7f83-7c85-4926-a966-bad5401ac7c9" (UID: "14db7f83-7c85-4926-a966-bad5401ac7c9"). InnerVolumeSpecName "kube-api-access-d77rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:15 crc kubenswrapper[4794]: I1215 14:22:15.700454 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d77rk\" (UniqueName: \"kubernetes.io/projected/14db7f83-7c85-4926-a966-bad5401ac7c9-kube-api-access-d77rk\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:16 crc kubenswrapper[4794]: I1215 14:22:16.155063 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-sg2tp" Dec 15 14:22:16 crc kubenswrapper[4794]: I1215 14:22:16.156738 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-sg2tp" event={"ID":"14db7f83-7c85-4926-a966-bad5401ac7c9","Type":"ContainerDied","Data":"81016abfcb438c6f01109a307bc9cf239c0d1d1009f7e66a13bff40c9cf2c029"} Dec 15 14:22:16 crc kubenswrapper[4794]: I1215 14:22:16.156797 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81016abfcb438c6f01109a307bc9cf239c0d1d1009f7e66a13bff40c9cf2c029" Dec 15 14:22:16 crc kubenswrapper[4794]: I1215 14:22:16.156832 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:18 crc kubenswrapper[4794]: I1215 14:22:18.768263 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:22:18 crc kubenswrapper[4794]: E1215 14:22:18.768986 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.175513 4794 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-b809-account-create-kqj5f"] Dec 15 14:22:22 crc kubenswrapper[4794]: E1215 14:22:22.176324 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14db7f83-7c85-4926-a966-bad5401ac7c9" containerName="mariadb-database-create" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.176341 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="14db7f83-7c85-4926-a966-bad5401ac7c9" containerName="mariadb-database-create" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.176551 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="14db7f83-7c85-4926-a966-bad5401ac7c9" containerName="mariadb-database-create" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.177282 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.179567 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.187182 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-b809-account-create-kqj5f"] Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.216313 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxqfw\" (UniqueName: \"kubernetes.io/projected/d51a90b6-87a0-4521-8cd2-0689e1c35405-kube-api-access-rxqfw\") pod \"watcher-b809-account-create-kqj5f\" (UID: \"d51a90b6-87a0-4521-8cd2-0689e1c35405\") " pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.317504 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxqfw\" (UniqueName: 
\"kubernetes.io/projected/d51a90b6-87a0-4521-8cd2-0689e1c35405-kube-api-access-rxqfw\") pod \"watcher-b809-account-create-kqj5f\" (UID: \"d51a90b6-87a0-4521-8cd2-0689e1c35405\") " pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.337049 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxqfw\" (UniqueName: \"kubernetes.io/projected/d51a90b6-87a0-4521-8cd2-0689e1c35405-kube-api-access-rxqfw\") pod \"watcher-b809-account-create-kqj5f\" (UID: \"d51a90b6-87a0-4521-8cd2-0689e1c35405\") " pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.504235 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:22 crc kubenswrapper[4794]: I1215 14:22:22.960137 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-b809-account-create-kqj5f"] Dec 15 14:22:23 crc kubenswrapper[4794]: I1215 14:22:23.217606 4794 generic.go:334] "Generic (PLEG): container finished" podID="d51a90b6-87a0-4521-8cd2-0689e1c35405" containerID="decf9389ca33def0123c70849313cab03cd9dfa86fef8a7007d487f220b7b413" exitCode=0 Dec 15 14:22:23 crc kubenswrapper[4794]: I1215 14:22:23.217657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" event={"ID":"d51a90b6-87a0-4521-8cd2-0689e1c35405","Type":"ContainerDied","Data":"decf9389ca33def0123c70849313cab03cd9dfa86fef8a7007d487f220b7b413"} Dec 15 14:22:23 crc kubenswrapper[4794]: I1215 14:22:23.217931 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" event={"ID":"d51a90b6-87a0-4521-8cd2-0689e1c35405","Type":"ContainerStarted","Data":"4df9359a94e0f17aa532027d2fdda2e2194ba4ef33f2b8aadfdea062729950d8"} Dec 15 14:22:24 crc kubenswrapper[4794]: 
I1215 14:22:24.571830 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:24 crc kubenswrapper[4794]: I1215 14:22:24.754242 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxqfw\" (UniqueName: \"kubernetes.io/projected/d51a90b6-87a0-4521-8cd2-0689e1c35405-kube-api-access-rxqfw\") pod \"d51a90b6-87a0-4521-8cd2-0689e1c35405\" (UID: \"d51a90b6-87a0-4521-8cd2-0689e1c35405\") " Dec 15 14:22:24 crc kubenswrapper[4794]: I1215 14:22:24.759612 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51a90b6-87a0-4521-8cd2-0689e1c35405-kube-api-access-rxqfw" (OuterVolumeSpecName: "kube-api-access-rxqfw") pod "d51a90b6-87a0-4521-8cd2-0689e1c35405" (UID: "d51a90b6-87a0-4521-8cd2-0689e1c35405"). InnerVolumeSpecName "kube-api-access-rxqfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:24 crc kubenswrapper[4794]: I1215 14:22:24.856218 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxqfw\" (UniqueName: \"kubernetes.io/projected/d51a90b6-87a0-4521-8cd2-0689e1c35405-kube-api-access-rxqfw\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:25 crc kubenswrapper[4794]: I1215 14:22:25.235719 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" event={"ID":"d51a90b6-87a0-4521-8cd2-0689e1c35405","Type":"ContainerDied","Data":"4df9359a94e0f17aa532027d2fdda2e2194ba4ef33f2b8aadfdea062729950d8"} Dec 15 14:22:25 crc kubenswrapper[4794]: I1215 14:22:25.235757 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4df9359a94e0f17aa532027d2fdda2e2194ba4ef33f2b8aadfdea062729950d8" Dec 15 14:22:25 crc kubenswrapper[4794]: I1215 14:22:25.235807 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-b809-account-create-kqj5f" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.479987 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-42r2l"] Dec 15 14:22:27 crc kubenswrapper[4794]: E1215 14:22:27.480811 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51a90b6-87a0-4521-8cd2-0689e1c35405" containerName="mariadb-account-create" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.480823 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51a90b6-87a0-4521-8cd2-0689e1c35405" containerName="mariadb-account-create" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.480983 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51a90b6-87a0-4521-8cd2-0689e1c35405" containerName="mariadb-account-create" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.481483 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.484348 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.484553 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-89trk" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.503435 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-42r2l"] Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.602326 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-config-data\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.602383 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl5ch\" (UniqueName: \"kubernetes.io/projected/08415ca6-d8d9-479b-be5e-b2ff38e381d4-kube-api-access-pl5ch\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.602413 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.602503 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-db-sync-config-data\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.703744 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-config-data\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.703809 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl5ch\" (UniqueName: \"kubernetes.io/projected/08415ca6-d8d9-479b-be5e-b2ff38e381d4-kube-api-access-pl5ch\") 
pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.703851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.703933 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-db-sync-config-data\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.709646 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.710948 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-db-sync-config-data\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.711151 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-config-data\") pod 
\"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.723237 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl5ch\" (UniqueName: \"kubernetes.io/projected/08415ca6-d8d9-479b-be5e-b2ff38e381d4-kube-api-access-pl5ch\") pod \"watcher-kuttl-db-sync-42r2l\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:27 crc kubenswrapper[4794]: I1215 14:22:27.802977 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:28 crc kubenswrapper[4794]: I1215 14:22:28.284843 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-42r2l"] Dec 15 14:22:28 crc kubenswrapper[4794]: W1215 14:22:28.290423 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08415ca6_d8d9_479b_be5e_b2ff38e381d4.slice/crio-095a41852edfc294006e5c3a1867af98212770d46246db5d6c20581d5f3a4a45 WatchSource:0}: Error finding container 095a41852edfc294006e5c3a1867af98212770d46246db5d6c20581d5f3a4a45: Status 404 returned error can't find the container with id 095a41852edfc294006e5c3a1867af98212770d46246db5d6c20581d5f3a4a45 Dec 15 14:22:29 crc kubenswrapper[4794]: I1215 14:22:29.267825 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" event={"ID":"08415ca6-d8d9-479b-be5e-b2ff38e381d4","Type":"ContainerStarted","Data":"5b501d753f3d4c153a6bbf97d51a485c4d521a9c75ff9e1fd286362bf0ada23c"} Dec 15 14:22:29 crc kubenswrapper[4794]: I1215 14:22:29.269149 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" 
event={"ID":"08415ca6-d8d9-479b-be5e-b2ff38e381d4","Type":"ContainerStarted","Data":"095a41852edfc294006e5c3a1867af98212770d46246db5d6c20581d5f3a4a45"} Dec 15 14:22:29 crc kubenswrapper[4794]: I1215 14:22:29.292397 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" podStartSLOduration=2.292380648 podStartE2EDuration="2.292380648s" podCreationTimestamp="2025-12-15 14:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:22:29.286681066 +0000 UTC m=+1711.138703504" watchObservedRunningTime="2025-12-15 14:22:29.292380648 +0000 UTC m=+1711.144403076" Dec 15 14:22:31 crc kubenswrapper[4794]: I1215 14:22:31.299446 4794 generic.go:334] "Generic (PLEG): container finished" podID="08415ca6-d8d9-479b-be5e-b2ff38e381d4" containerID="5b501d753f3d4c153a6bbf97d51a485c4d521a9c75ff9e1fd286362bf0ada23c" exitCode=0 Dec 15 14:22:31 crc kubenswrapper[4794]: I1215 14:22:31.299511 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" event={"ID":"08415ca6-d8d9-479b-be5e-b2ff38e381d4","Type":"ContainerDied","Data":"5b501d753f3d4c153a6bbf97d51a485c4d521a9c75ff9e1fd286362bf0ada23c"} Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.673353 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.736939 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:22:32 crc kubenswrapper[4794]: E1215 14:22:32.737304 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.786802 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-combined-ca-bundle\") pod \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.786844 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-config-data\") pod \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.787024 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-db-sync-config-data\") pod \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.787118 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl5ch\" 
(UniqueName: \"kubernetes.io/projected/08415ca6-d8d9-479b-be5e-b2ff38e381d4-kube-api-access-pl5ch\") pod \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\" (UID: \"08415ca6-d8d9-479b-be5e-b2ff38e381d4\") " Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.799888 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "08415ca6-d8d9-479b-be5e-b2ff38e381d4" (UID: "08415ca6-d8d9-479b-be5e-b2ff38e381d4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.810346 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08415ca6-d8d9-479b-be5e-b2ff38e381d4-kube-api-access-pl5ch" (OuterVolumeSpecName: "kube-api-access-pl5ch") pod "08415ca6-d8d9-479b-be5e-b2ff38e381d4" (UID: "08415ca6-d8d9-479b-be5e-b2ff38e381d4"). InnerVolumeSpecName "kube-api-access-pl5ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.821092 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08415ca6-d8d9-479b-be5e-b2ff38e381d4" (UID: "08415ca6-d8d9-479b-be5e-b2ff38e381d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.833313 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-config-data" (OuterVolumeSpecName: "config-data") pod "08415ca6-d8d9-479b-be5e-b2ff38e381d4" (UID: "08415ca6-d8d9-479b-be5e-b2ff38e381d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.889645 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl5ch\" (UniqueName: \"kubernetes.io/projected/08415ca6-d8d9-479b-be5e-b2ff38e381d4-kube-api-access-pl5ch\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.889681 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.889696 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:32 crc kubenswrapper[4794]: I1215 14:22:32.889707 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08415ca6-d8d9-479b-be5e-b2ff38e381d4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.321508 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" event={"ID":"08415ca6-d8d9-479b-be5e-b2ff38e381d4","Type":"ContainerDied","Data":"095a41852edfc294006e5c3a1867af98212770d46246db5d6c20581d5f3a4a45"} Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.321809 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="095a41852edfc294006e5c3a1867af98212770d46246db5d6c20581d5f3a4a45" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.321574 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-42r2l" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.585821 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:22:33 crc kubenswrapper[4794]: E1215 14:22:33.586222 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08415ca6-d8d9-479b-be5e-b2ff38e381d4" containerName="watcher-kuttl-db-sync" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.586243 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="08415ca6-d8d9-479b-be5e-b2ff38e381d4" containerName="watcher-kuttl-db-sync" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.586457 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="08415ca6-d8d9-479b-be5e-b2ff38e381d4" containerName="watcher-kuttl-db-sync" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.587504 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.592787 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.593037 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-89trk" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.603527 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.614254 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.615496 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.620918 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.628738 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.697513 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.701472 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702186 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702224 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7q8\" (UniqueName: \"kubernetes.io/projected/5432b186-d38f-40b5-9979-8d197844a905-kube-api-access-ww7q8\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702258 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9eb4ad7-832e-40ab-a921-f27b230508d6-logs\") pod \"watcher-kuttl-api-0\" (UID: 
\"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702317 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvsp\" (UniqueName: \"kubernetes.io/projected/e9eb4ad7-832e-40ab-a921-f27b230508d6-kube-api-access-jcvsp\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702368 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702423 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702455 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702474 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702507 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5432b186-d38f-40b5-9979-8d197844a905-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702542 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702609 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.702673 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.704087 4794 reflector.go:368] Caches populated for *v1.Secret 
from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.707266 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803632 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803676 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803719 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803743 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803758 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803774 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803790 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803815 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5432b186-d38f-40b5-9979-8d197844a905-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803836 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803858 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803877 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803914 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803940 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803958 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7q8\" (UniqueName: \"kubernetes.io/projected/5432b186-d38f-40b5-9979-8d197844a905-kube-api-access-ww7q8\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.803981 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfdv\" (UniqueName: \"kubernetes.io/projected/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-kube-api-access-mhfdv\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.804007 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9eb4ad7-832e-40ab-a921-f27b230508d6-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.804027 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvsp\" (UniqueName: \"kubernetes.io/projected/e9eb4ad7-832e-40ab-a921-f27b230508d6-kube-api-access-jcvsp\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.804743 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5432b186-d38f-40b5-9979-8d197844a905-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.807183 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9eb4ad7-832e-40ab-a921-f27b230508d6-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.810628 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.810722 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.810880 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.811324 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.813085 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.813825 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.820025 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.822277 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvsp\" (UniqueName: \"kubernetes.io/projected/e9eb4ad7-832e-40ab-a921-f27b230508d6-kube-api-access-jcvsp\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.825028 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.831075 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7q8\" (UniqueName: \"kubernetes.io/projected/5432b186-d38f-40b5-9979-8d197844a905-kube-api-access-ww7q8\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.902844 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.905851 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfdv\" (UniqueName: \"kubernetes.io/projected/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-kube-api-access-mhfdv\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.906008 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.906179 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.906286 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.906406 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.906499 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.910677 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.910784 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.911525 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc kubenswrapper[4794]: I1215 14:22:33.929651 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfdv\" (UniqueName: \"kubernetes.io/projected/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-kube-api-access-mhfdv\") pod \"watcher-kuttl-applier-0\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:33 crc 
kubenswrapper[4794]: I1215 14:22:33.958289 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:34 crc kubenswrapper[4794]: I1215 14:22:34.019175 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:34 crc kubenswrapper[4794]: I1215 14:22:34.440933 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:22:34 crc kubenswrapper[4794]: W1215 14:22:34.457051 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9eb4ad7_832e_40ab_a921_f27b230508d6.slice/crio-56ccc6fa6937a1596586de9120b064a682cc48cc5672cf52b03e13b5cd61c3af WatchSource:0}: Error finding container 56ccc6fa6937a1596586de9120b064a682cc48cc5672cf52b03e13b5cd61c3af: Status 404 returned error can't find the container with id 56ccc6fa6937a1596586de9120b064a682cc48cc5672cf52b03e13b5cd61c3af Dec 15 14:22:34 crc kubenswrapper[4794]: I1215 14:22:34.518353 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:22:34 crc kubenswrapper[4794]: I1215 14:22:34.633353 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.337362 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"04bdfdbc-2a4d-4081-9230-c0307b4cca8c","Type":"ContainerStarted","Data":"fc1915a2c9d64249d0403fc3d3360dfd376267238a8342e02c2e8c7e1beaf88e"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.337750 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"04bdfdbc-2a4d-4081-9230-c0307b4cca8c","Type":"ContainerStarted","Data":"27ed47058025b5ffce7a60f8657a938815f1f0bdb1920079e3e71f7da0034a78"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.339535 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"5432b186-d38f-40b5-9979-8d197844a905","Type":"ContainerStarted","Data":"5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.339568 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"5432b186-d38f-40b5-9979-8d197844a905","Type":"ContainerStarted","Data":"31512775badb369fdf71a901bf72b575e292ed8289eff0440ab0a577507b757e"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.341446 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e9eb4ad7-832e-40ab-a921-f27b230508d6","Type":"ContainerStarted","Data":"59443a897f6786ee1d6df85c2f05a38fca1b98c88d7bc6f4ad5ee4da01581962"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.341522 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e9eb4ad7-832e-40ab-a921-f27b230508d6","Type":"ContainerStarted","Data":"166a2f319423ec5d991e29434c83f2b1a9286046f8bec2da27ccee4e674d4c8e"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.341542 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e9eb4ad7-832e-40ab-a921-f27b230508d6","Type":"ContainerStarted","Data":"56ccc6fa6937a1596586de9120b064a682cc48cc5672cf52b03e13b5cd61c3af"} Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.341620 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 
14:22:35.401780 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.401756135 podStartE2EDuration="2.401756135s" podCreationTimestamp="2025-12-15 14:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:22:35.369308546 +0000 UTC m=+1717.221330984" watchObservedRunningTime="2025-12-15 14:22:35.401756135 +0000 UTC m=+1717.253778573" Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.425262 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.425246319 podStartE2EDuration="2.425246319s" podCreationTimestamp="2025-12-15 14:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:22:35.404336578 +0000 UTC m=+1717.256359046" watchObservedRunningTime="2025-12-15 14:22:35.425246319 +0000 UTC m=+1717.277268757" Dec 15 14:22:35 crc kubenswrapper[4794]: I1215 14:22:35.429353 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.429341955 podStartE2EDuration="2.429341955s" podCreationTimestamp="2025-12-15 14:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:22:35.422356678 +0000 UTC m=+1717.274379116" watchObservedRunningTime="2025-12-15 14:22:35.429341955 +0000 UTC m=+1717.281364393" Dec 15 14:22:36 crc kubenswrapper[4794]: I1215 14:22:36.134933 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:37 crc kubenswrapper[4794]: I1215 14:22:37.300740 
4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:37 crc kubenswrapper[4794]: I1215 14:22:37.700972 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:38 crc kubenswrapper[4794]: I1215 14:22:38.486413 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:38 crc kubenswrapper[4794]: I1215 14:22:38.903654 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:39 crc kubenswrapper[4794]: I1215 14:22:39.020187 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:39 crc kubenswrapper[4794]: I1215 14:22:39.667607 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:40 crc kubenswrapper[4794]: I1215 14:22:40.756048 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:40 crc kubenswrapper[4794]: I1215 14:22:40.850549 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:42 crc kubenswrapper[4794]: I1215 14:22:42.075812 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:43 crc kubenswrapper[4794]: I1215 14:22:43.260254 4794 log.go:25] "Finished 
parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:43 crc kubenswrapper[4794]: I1215 14:22:43.903613 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:43 crc kubenswrapper[4794]: I1215 14:22:43.909718 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:43 crc kubenswrapper[4794]: I1215 14:22:43.958827 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:43 crc kubenswrapper[4794]: I1215 14:22:43.988768 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.020244 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.044344 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.442678 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.448029 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.452098 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.483001 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:22:44 crc kubenswrapper[4794]: I1215 14:22:44.493889 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:22:45 crc kubenswrapper[4794]: I1215 14:22:45.623859 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:45 crc kubenswrapper[4794]: I1215 14:22:45.954314 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.372473 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-create-kkqg6"] Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.374061 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.379734 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-kkqg6"] Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.576083 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6rfn\" (UniqueName: \"kubernetes.io/projected/5551032f-6488-45e4-a75e-6895f05c408b-kube-api-access-s6rfn\") pod \"cinder-db-create-kkqg6\" (UID: \"5551032f-6488-45e4-a75e-6895f05c408b\") " pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.677267 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6rfn\" (UniqueName: \"kubernetes.io/projected/5551032f-6488-45e4-a75e-6895f05c408b-kube-api-access-s6rfn\") pod \"cinder-db-create-kkqg6\" (UID: \"5551032f-6488-45e4-a75e-6895f05c408b\") " pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.689378 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.690060 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="sg-core" containerID="cri-o://41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5" gracePeriod=30 Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.690117 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-notification-agent" containerID="cri-o://99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400" gracePeriod=30 Dec 15 14:22:46 crc kubenswrapper[4794]: 
I1215 14:22:46.690146 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="proxy-httpd" containerID="cri-o://6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23" gracePeriod=30 Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.689952 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-central-agent" containerID="cri-o://736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6" gracePeriod=30 Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.701808 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6rfn\" (UniqueName: \"kubernetes.io/projected/5551032f-6488-45e4-a75e-6895f05c408b-kube-api-access-s6rfn\") pod \"cinder-db-create-kkqg6\" (UID: \"5551032f-6488-45e4-a75e-6895f05c408b\") " pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.717440 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:46 crc kubenswrapper[4794]: I1215 14:22:46.737607 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:22:46 crc kubenswrapper[4794]: E1215 14:22:46.737921 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.134842 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.188671 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-kkqg6"] Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.466349 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerID="6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23" exitCode=0 Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.466684 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerID="41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5" exitCode=2 Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.466703 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerID="736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6" exitCode=0 Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.466405 
4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerDied","Data":"6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23"} Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.466788 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerDied","Data":"41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5"} Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.466807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerDied","Data":"736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6"} Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.468977 4794 generic.go:334] "Generic (PLEG): container finished" podID="5551032f-6488-45e4-a75e-6895f05c408b" containerID="d3449d632eeb3ea108bf9350932219ba67827e47161b8fb69e142ecb881cd9ab" exitCode=0 Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.469015 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-kkqg6" event={"ID":"5551032f-6488-45e4-a75e-6895f05c408b","Type":"ContainerDied","Data":"d3449d632eeb3ea108bf9350932219ba67827e47161b8fb69e142ecb881cd9ab"} Dec 15 14:22:47 crc kubenswrapper[4794]: I1215 14:22:47.469068 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-kkqg6" event={"ID":"5551032f-6488-45e4-a75e-6895f05c408b","Type":"ContainerStarted","Data":"20280c86ef98c856317d25e3375a8c7a302254eb6bdd8dfbd5ce705db17d8d04"} Dec 15 14:22:48 crc kubenswrapper[4794]: I1215 14:22:48.342563 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:48 crc kubenswrapper[4794]: I1215 14:22:48.998571 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.120103 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6rfn\" (UniqueName: \"kubernetes.io/projected/5551032f-6488-45e4-a75e-6895f05c408b-kube-api-access-s6rfn\") pod \"5551032f-6488-45e4-a75e-6895f05c408b\" (UID: \"5551032f-6488-45e4-a75e-6895f05c408b\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.156056 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5551032f-6488-45e4-a75e-6895f05c408b-kube-api-access-s6rfn" (OuterVolumeSpecName: "kube-api-access-s6rfn") pod "5551032f-6488-45e4-a75e-6895f05c408b" (UID: "5551032f-6488-45e4-a75e-6895f05c408b"). InnerVolumeSpecName "kube-api-access-s6rfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.222314 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6rfn\" (UniqueName: \"kubernetes.io/projected/5551032f-6488-45e4-a75e-6895f05c408b-kube-api-access-s6rfn\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.306524 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327050 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-run-httpd\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327136 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-sg-core-conf-yaml\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327200 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-log-httpd\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327308 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-ceilometer-tls-certs\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327338 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-scripts\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327384 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-combined-ca-bundle\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327408 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6spq\" (UniqueName: \"kubernetes.io/projected/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-kube-api-access-k6spq\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327459 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327512 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-config-data\") pod \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\" (UID: \"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c\") " Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.327757 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.328259 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.328281 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.342920 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-kube-api-access-k6spq" (OuterVolumeSpecName: "kube-api-access-k6spq") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "kube-api-access-k6spq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.344264 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-scripts" (OuterVolumeSpecName: "scripts") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.382772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.430557 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.430603 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.430613 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6spq\" (UniqueName: \"kubernetes.io/projected/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-kube-api-access-k6spq\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.485056 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.537130 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.544804 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.545540 4794 generic.go:334] "Generic (PLEG): container finished" podID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerID="99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400" exitCode=0 Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.545698 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.545761 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerDied","Data":"99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400"} Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.545826 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c","Type":"ContainerDied","Data":"2f5c764712af362136f695591835c2105a27e584038fe1219becaf8a1efe1c7c"} Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.545850 4794 scope.go:117] "RemoveContainer" containerID="6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.551890 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-config-data" (OuterVolumeSpecName: "config-data") pod "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" (UID: "4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.552018 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-kkqg6" event={"ID":"5551032f-6488-45e4-a75e-6895f05c408b","Type":"ContainerDied","Data":"20280c86ef98c856317d25e3375a8c7a302254eb6bdd8dfbd5ce705db17d8d04"} Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.552054 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20280c86ef98c856317d25e3375a8c7a302254eb6bdd8dfbd5ce705db17d8d04" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.552109 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-kkqg6" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.556175 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.564113 4794 scope.go:117] "RemoveContainer" containerID="41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.597786 4794 scope.go:117] "RemoveContainer" containerID="99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.628074 4794 scope.go:117] "RemoveContainer" containerID="736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.640965 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.641040 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.661063 4794 scope.go:117] "RemoveContainer" containerID="6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.661371 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23\": container with ID starting with 6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23 not found: ID does not exist" containerID="6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.661401 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23"} err="failed to get container status \"6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23\": rpc error: code = NotFound desc = could not find container \"6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23\": container with ID starting with 6e4189db4eabf8a05da957ed80f920aec94a6937824cdf08494f25a871ab5b23 not found: ID does not exist" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.661422 4794 scope.go:117] "RemoveContainer" containerID="41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.661746 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5\": container with ID starting with 41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5 not found: ID does not exist" containerID="41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5" Dec 15 
14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.661768 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5"} err="failed to get container status \"41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5\": rpc error: code = NotFound desc = could not find container \"41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5\": container with ID starting with 41de20e2a415420f56857b655d31e42588020971c7816c4d579ecf68a95196d5 not found: ID does not exist" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.661781 4794 scope.go:117] "RemoveContainer" containerID="99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.662190 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400\": container with ID starting with 99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400 not found: ID does not exist" containerID="99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.662210 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400"} err="failed to get container status \"99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400\": rpc error: code = NotFound desc = could not find container \"99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400\": container with ID starting with 99f9279e81f698c9207ba161a799018d0594532aa9bbc794694f81c355666400 not found: ID does not exist" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.662222 4794 scope.go:117] "RemoveContainer" 
containerID="736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.662402 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6\": container with ID starting with 736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6 not found: ID does not exist" containerID="736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.662419 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6"} err="failed to get container status \"736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6\": rpc error: code = NotFound desc = could not find container \"736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6\": container with ID starting with 736c7d24a26e92abcbfbda4ff93080117662463a2e48d124cce580f873aba3c6 not found: ID does not exist" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.965698 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.974656 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.988822 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.989231 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="proxy-httpd" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989254 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" 
containerName="proxy-httpd" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.989265 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5551032f-6488-45e4-a75e-6895f05c408b" containerName="mariadb-database-create" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989274 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5551032f-6488-45e4-a75e-6895f05c408b" containerName="mariadb-database-create" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.989288 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="sg-core" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989296 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="sg-core" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.989311 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-central-agent" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989319 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-central-agent" Dec 15 14:22:49 crc kubenswrapper[4794]: E1215 14:22:49.989332 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-notification-agent" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989341 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-notification-agent" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989560 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-notification-agent" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989575 4794 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="sg-core" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989608 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="ceilometer-central-agent" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989627 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" containerName="proxy-httpd" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.989643 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5551032f-6488-45e4-a75e-6895f05c408b" containerName="mariadb-database-create" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.991561 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.994501 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.998755 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:22:49 crc kubenswrapper[4794]: I1215 14:22:49.999360 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.008188 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149151 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-config-data\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: 
I1215 14:22:50.149196 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-log-httpd\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-run-httpd\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149416 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-scripts\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149482 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149526 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ttt\" (UniqueName: \"kubernetes.io/projected/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-kube-api-access-f4ttt\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149594 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.149629 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.250782 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ttt\" (UniqueName: \"kubernetes.io/projected/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-kube-api-access-f4ttt\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.250840 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.250864 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.250917 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-config-data\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.250949 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-log-httpd\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.250988 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-run-httpd\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.251027 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-scripts\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.251058 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.251790 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-log-httpd\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 
14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.251893 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-run-httpd\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.256488 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.256489 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-scripts\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.257628 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-config-data\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.257919 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.258307 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.277648 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ttt\" (UniqueName: \"kubernetes.io/projected/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-kube-api-access-f4ttt\") pod \"ceilometer-0\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.314478 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.734301 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:50 crc kubenswrapper[4794]: W1215 14:22:50.745734 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce95d711_ab01_44f6_8a2b_dcf1b4b43f76.slice/crio-5663a1fd1106d7d84e3342dd7b0079e55865ae32658de94421c574a4802e7d73 WatchSource:0}: Error finding container 5663a1fd1106d7d84e3342dd7b0079e55865ae32658de94421c574a4802e7d73: Status 404 returned error can't find the container with id 5663a1fd1106d7d84e3342dd7b0079e55865ae32658de94421c574a4802e7d73 Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.749520 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c" path="/var/lib/kubelet/pods/4a1f24b0-81f4-45ed-97b9-3bc4bf7e816c/volumes" Dec 15 14:22:50 crc kubenswrapper[4794]: I1215 14:22:50.750657 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:22:51 crc kubenswrapper[4794]: I1215 
14:22:51.579223 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerStarted","Data":"5663a1fd1106d7d84e3342dd7b0079e55865ae32658de94421c574a4802e7d73"} Dec 15 14:22:51 crc kubenswrapper[4794]: I1215 14:22:51.911749 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:52 crc kubenswrapper[4794]: I1215 14:22:52.590513 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerStarted","Data":"c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73"} Dec 15 14:22:52 crc kubenswrapper[4794]: I1215 14:22:52.590563 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerStarted","Data":"29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866"} Dec 15 14:22:53 crc kubenswrapper[4794]: I1215 14:22:53.087575 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:53 crc kubenswrapper[4794]: I1215 14:22:53.600716 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerStarted","Data":"bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1"} Dec 15 14:22:54 crc kubenswrapper[4794]: I1215 14:22:54.313714 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:54 crc kubenswrapper[4794]: I1215 
14:22:54.612539 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerStarted","Data":"b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b"} Dec 15 14:22:54 crc kubenswrapper[4794]: I1215 14:22:54.613068 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:22:54 crc kubenswrapper[4794]: I1215 14:22:54.643165 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.052969721 podStartE2EDuration="5.643141845s" podCreationTimestamp="2025-12-15 14:22:49 +0000 UTC" firstStartedPulling="2025-12-15 14:22:50.748834384 +0000 UTC m=+1732.600856822" lastFinishedPulling="2025-12-15 14:22:54.339006498 +0000 UTC m=+1736.191028946" observedRunningTime="2025-12-15 14:22:54.638291898 +0000 UTC m=+1736.490314336" watchObservedRunningTime="2025-12-15 14:22:54.643141845 +0000 UTC m=+1736.495164293" Dec 15 14:22:55 crc kubenswrapper[4794]: I1215 14:22:55.517118 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.389365 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-2114-account-create-9w88k"] Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.391091 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.393276 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-db-secret" Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.404147 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-2114-account-create-9w88k"] Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.550381 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5c6g\" (UniqueName: \"kubernetes.io/projected/e6c5bd48-31cd-4fa1-88c0-9afb783929eb-kube-api-access-z5c6g\") pod \"cinder-2114-account-create-9w88k\" (UID: \"e6c5bd48-31cd-4fa1-88c0-9afb783929eb\") " pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.651515 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5c6g\" (UniqueName: \"kubernetes.io/projected/e6c5bd48-31cd-4fa1-88c0-9afb783929eb-kube-api-access-z5c6g\") pod \"cinder-2114-account-create-9w88k\" (UID: \"e6c5bd48-31cd-4fa1-88c0-9afb783929eb\") " pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.672879 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5c6g\" (UniqueName: \"kubernetes.io/projected/e6c5bd48-31cd-4fa1-88c0-9afb783929eb-kube-api-access-z5c6g\") pod \"cinder-2114-account-create-9w88k\" (UID: \"e6c5bd48-31cd-4fa1-88c0-9afb783929eb\") " pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:22:56 crc kubenswrapper[4794]: I1215 14:22:56.679286 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:56 crc 
kubenswrapper[4794]: I1215 14:22:56.715313 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:22:57 crc kubenswrapper[4794]: I1215 14:22:57.205904 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-2114-account-create-9w88k"] Dec 15 14:22:57 crc kubenswrapper[4794]: I1215 14:22:57.636515 4794 generic.go:334] "Generic (PLEG): container finished" podID="e6c5bd48-31cd-4fa1-88c0-9afb783929eb" containerID="8e61b9bb8b8adecb2f87a5474c5e72e346b00c4bb0374303539e2979244d2b2e" exitCode=0 Dec 15 14:22:57 crc kubenswrapper[4794]: I1215 14:22:57.636903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" event={"ID":"e6c5bd48-31cd-4fa1-88c0-9afb783929eb","Type":"ContainerDied","Data":"8e61b9bb8b8adecb2f87a5474c5e72e346b00c4bb0374303539e2979244d2b2e"} Dec 15 14:22:57 crc kubenswrapper[4794]: I1215 14:22:57.636935 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" event={"ID":"e6c5bd48-31cd-4fa1-88c0-9afb783929eb","Type":"ContainerStarted","Data":"11f4b2c20f81e00b7613746b88ca8fa9fa23f267e65a2493f327d077a47a52b9"} Dec 15 14:22:57 crc kubenswrapper[4794]: I1215 14:22:57.737249 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:22:57 crc kubenswrapper[4794]: E1215 14:22:57.737510 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:22:57 crc kubenswrapper[4794]: I1215 
14:22:57.921141 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.048452 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.135196 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.209358 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5c6g\" (UniqueName: \"kubernetes.io/projected/e6c5bd48-31cd-4fa1-88c0-9afb783929eb-kube-api-access-z5c6g\") pod \"e6c5bd48-31cd-4fa1-88c0-9afb783929eb\" (UID: \"e6c5bd48-31cd-4fa1-88c0-9afb783929eb\") " Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.222818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c5bd48-31cd-4fa1-88c0-9afb783929eb-kube-api-access-z5c6g" (OuterVolumeSpecName: "kube-api-access-z5c6g") pod "e6c5bd48-31cd-4fa1-88c0-9afb783929eb" (UID: "e6c5bd48-31cd-4fa1-88c0-9afb783929eb"). InnerVolumeSpecName "kube-api-access-z5c6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.311417 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5c6g\" (UniqueName: \"kubernetes.io/projected/e6c5bd48-31cd-4fa1-88c0-9afb783929eb-kube-api-access-z5c6g\") on node \"crc\" DevicePath \"\"" Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.656270 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" event={"ID":"e6c5bd48-31cd-4fa1-88c0-9afb783929eb","Type":"ContainerDied","Data":"11f4b2c20f81e00b7613746b88ca8fa9fa23f267e65a2493f327d077a47a52b9"} Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.656304 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11f4b2c20f81e00b7613746b88ca8fa9fa23f267e65a2493f327d077a47a52b9" Dec 15 14:22:59 crc kubenswrapper[4794]: I1215 14:22:59.656338 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-2114-account-create-9w88k" Dec 15 14:23:00 crc kubenswrapper[4794]: I1215 14:23:00.301446 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.475452 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.544343 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-sync-wl49x"] Dec 15 14:23:01 crc kubenswrapper[4794]: E1215 14:23:01.544933 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c5bd48-31cd-4fa1-88c0-9afb783929eb" containerName="mariadb-account-create" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.545040 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c5bd48-31cd-4fa1-88c0-9afb783929eb" containerName="mariadb-account-create" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.545318 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c5bd48-31cd-4fa1-88c0-9afb783929eb" containerName="mariadb-account-create" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.546310 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.549485 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.550764 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-wt6v7" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.552760 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gccpm\" (UniqueName: \"kubernetes.io/projected/c97c36d1-1f8b-4e57-952e-19550c836b71-kube-api-access-gccpm\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.552840 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-db-sync-config-data\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.552878 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-config-data\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.552953 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-combined-ca-bundle\") pod \"cinder-db-sync-wl49x\" (UID: 
\"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.553085 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c97c36d1-1f8b-4e57-952e-19550c836b71-etc-machine-id\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.553199 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-scripts\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.559786 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.560275 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-wl49x"] Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654215 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-config-data\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654275 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-combined-ca-bundle\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 
14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654332 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c97c36d1-1f8b-4e57-952e-19550c836b71-etc-machine-id\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654389 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-scripts\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654434 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gccpm\" (UniqueName: \"kubernetes.io/projected/c97c36d1-1f8b-4e57-952e-19550c836b71-kube-api-access-gccpm\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654447 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c97c36d1-1f8b-4e57-952e-19550c836b71-etc-machine-id\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.654470 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-db-sync-config-data\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.659424 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-db-sync-config-data\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.660537 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-config-data\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.660697 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-combined-ca-bundle\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.660871 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-scripts\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.671907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gccpm\" (UniqueName: \"kubernetes.io/projected/c97c36d1-1f8b-4e57-952e-19550c836b71-kube-api-access-gccpm\") pod \"cinder-db-sync-wl49x\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:01 crc kubenswrapper[4794]: I1215 14:23:01.863746 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:02 crc kubenswrapper[4794]: I1215 14:23:02.427678 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-wl49x"] Dec 15 14:23:02 crc kubenswrapper[4794]: W1215 14:23:02.431363 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc97c36d1_1f8b_4e57_952e_19550c836b71.slice/crio-b196909ef2cf0eef81a7f9f345255464480a1e744374767208ac240e1da3e952 WatchSource:0}: Error finding container b196909ef2cf0eef81a7f9f345255464480a1e744374767208ac240e1da3e952: Status 404 returned error can't find the container with id b196909ef2cf0eef81a7f9f345255464480a1e744374767208ac240e1da3e952 Dec 15 14:23:02 crc kubenswrapper[4794]: I1215 14:23:02.700909 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-wl49x" event={"ID":"c97c36d1-1f8b-4e57-952e-19550c836b71","Type":"ContainerStarted","Data":"b196909ef2cf0eef81a7f9f345255464480a1e744374767208ac240e1da3e952"} Dec 15 14:23:02 crc kubenswrapper[4794]: I1215 14:23:02.729605 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:03 crc kubenswrapper[4794]: I1215 14:23:03.927005 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:05 crc kubenswrapper[4794]: I1215 14:23:05.109755 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:06 crc kubenswrapper[4794]: I1215 14:23:06.309049 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:07 crc kubenswrapper[4794]: I1215 14:23:07.083283 4794 scope.go:117] "RemoveContainer" containerID="16bce1b3c7c7ae605b4c5da5a921c6e06ee09cba862c3c611a5d02b91c6696d7" Dec 15 14:23:07 crc kubenswrapper[4794]: I1215 14:23:07.122353 4794 scope.go:117] "RemoveContainer" containerID="d95f3ca81c1d9be5aa62596822040202489e82ecb3a64fea78a2e03e3f9e4d4d" Dec 15 14:23:07 crc kubenswrapper[4794]: I1215 14:23:07.166522 4794 scope.go:117] "RemoveContainer" containerID="6b6665900b5f3994153d359afe69be57e774fe4797b3d08234da9b2d3e47ad19" Dec 15 14:23:07 crc kubenswrapper[4794]: I1215 14:23:07.492486 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:08 crc kubenswrapper[4794]: I1215 14:23:08.694024 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:09 crc kubenswrapper[4794]: I1215 14:23:09.892548 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:10 crc kubenswrapper[4794]: I1215 14:23:10.737228 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:23:10 crc kubenswrapper[4794]: E1215 14:23:10.737493 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:23:11 crc kubenswrapper[4794]: I1215 14:23:11.085077 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:12 crc kubenswrapper[4794]: I1215 14:23:12.323645 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:13 crc kubenswrapper[4794]: I1215 14:23:13.527974 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:14 crc kubenswrapper[4794]: I1215 14:23:14.733048 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:15 crc kubenswrapper[4794]: I1215 14:23:15.931227 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:17 crc kubenswrapper[4794]: I1215 14:23:17.114813 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:18 crc kubenswrapper[4794]: I1215 14:23:18.341659 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:18 crc kubenswrapper[4794]: E1215 14:23:18.493287 4794 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 15 14:23:18 crc kubenswrapper[4794]: E1215 14:23:18.493520 4794 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gccpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagati
on:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wl49x_watcher-kuttl-default(c97c36d1-1f8b-4e57-952e-19550c836b71): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 15 14:23:18 crc kubenswrapper[4794]: E1215 14:23:18.494640 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/cinder-db-sync-wl49x" podUID="c97c36d1-1f8b-4e57-952e-19550c836b71" Dec 15 14:23:18 crc kubenswrapper[4794]: E1215 14:23:18.857880 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="watcher-kuttl-default/cinder-db-sync-wl49x" podUID="c97c36d1-1f8b-4e57-952e-19550c836b71" Dec 15 14:23:19 crc kubenswrapper[4794]: I1215 14:23:19.532510 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:20 crc kubenswrapper[4794]: I1215 14:23:20.339490 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:23:21 crc kubenswrapper[4794]: I1215 14:23:21.355306 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:22 crc kubenswrapper[4794]: I1215 14:23:22.574163 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:22 crc kubenswrapper[4794]: I1215 14:23:22.737826 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:23:22 crc kubenswrapper[4794]: E1215 14:23:22.738065 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:23:23 crc kubenswrapper[4794]: I1215 14:23:23.780710 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:24 crc kubenswrapper[4794]: I1215 14:23:24.989785 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:26 crc kubenswrapper[4794]: I1215 14:23:26.184195 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:27 crc kubenswrapper[4794]: 
I1215 14:23:27.381643 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:28 crc kubenswrapper[4794]: I1215 14:23:28.580735 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:29 crc kubenswrapper[4794]: I1215 14:23:29.782076 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:30 crc kubenswrapper[4794]: I1215 14:23:30.953181 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:32 crc kubenswrapper[4794]: I1215 14:23:32.147100 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:33 crc kubenswrapper[4794]: I1215 14:23:33.363744 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:34 crc kubenswrapper[4794]: I1215 14:23:34.567149 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:35 crc kubenswrapper[4794]: I1215 14:23:35.811501 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:37 crc kubenswrapper[4794]: I1215 
14:23:37.066240 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:37 crc kubenswrapper[4794]: I1215 14:23:37.737910 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:23:37 crc kubenswrapper[4794]: E1215 14:23:37.738427 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:23:38 crc kubenswrapper[4794]: I1215 14:23:38.016815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-wl49x" event={"ID":"c97c36d1-1f8b-4e57-952e-19550c836b71","Type":"ContainerStarted","Data":"2984520b934caa937c9ae0139a16877a7696a351a94545275d12cf034ff8ddd2"} Dec 15 14:23:38 crc kubenswrapper[4794]: I1215 14:23:38.040170 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-db-sync-wl49x" podStartSLOduration=3.116807973 podStartE2EDuration="37.040151206s" podCreationTimestamp="2025-12-15 14:23:01 +0000 UTC" firstStartedPulling="2025-12-15 14:23:02.433447403 +0000 UTC m=+1744.285469841" lastFinishedPulling="2025-12-15 14:23:36.356790636 +0000 UTC m=+1778.208813074" observedRunningTime="2025-12-15 14:23:38.039961241 +0000 UTC m=+1779.891983689" watchObservedRunningTime="2025-12-15 14:23:38.040151206 +0000 UTC m=+1779.892173654" Dec 15 14:23:38 crc kubenswrapper[4794]: I1215 14:23:38.258955 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:39 crc kubenswrapper[4794]: I1215 14:23:39.504505 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:40 crc kubenswrapper[4794]: I1215 14:23:40.680222 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:41 crc kubenswrapper[4794]: I1215 14:23:41.047602 4794 generic.go:334] "Generic (PLEG): container finished" podID="c97c36d1-1f8b-4e57-952e-19550c836b71" containerID="2984520b934caa937c9ae0139a16877a7696a351a94545275d12cf034ff8ddd2" exitCode=0 Dec 15 14:23:41 crc kubenswrapper[4794]: I1215 14:23:41.047648 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-wl49x" event={"ID":"c97c36d1-1f8b-4e57-952e-19550c836b71","Type":"ContainerDied","Data":"2984520b934caa937c9ae0139a16877a7696a351a94545275d12cf034ff8ddd2"} Dec 15 14:23:41 crc kubenswrapper[4794]: I1215 14:23:41.939827 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.350130 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.457276 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-scripts\") pod \"c97c36d1-1f8b-4e57-952e-19550c836b71\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.457363 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c97c36d1-1f8b-4e57-952e-19550c836b71-etc-machine-id\") pod \"c97c36d1-1f8b-4e57-952e-19550c836b71\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.457514 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-combined-ca-bundle\") pod \"c97c36d1-1f8b-4e57-952e-19550c836b71\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.457553 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-db-sync-config-data\") pod \"c97c36d1-1f8b-4e57-952e-19550c836b71\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.457618 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-config-data\") pod \"c97c36d1-1f8b-4e57-952e-19550c836b71\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.457680 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gccpm\" (UniqueName: \"kubernetes.io/projected/c97c36d1-1f8b-4e57-952e-19550c836b71-kube-api-access-gccpm\") pod \"c97c36d1-1f8b-4e57-952e-19550c836b71\" (UID: \"c97c36d1-1f8b-4e57-952e-19550c836b71\") " Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.458771 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c97c36d1-1f8b-4e57-952e-19550c836b71-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c97c36d1-1f8b-4e57-952e-19550c836b71" (UID: "c97c36d1-1f8b-4e57-952e-19550c836b71"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.464238 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c97c36d1-1f8b-4e57-952e-19550c836b71" (UID: "c97c36d1-1f8b-4e57-952e-19550c836b71"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.466236 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-scripts" (OuterVolumeSpecName: "scripts") pod "c97c36d1-1f8b-4e57-952e-19550c836b71" (UID: "c97c36d1-1f8b-4e57-952e-19550c836b71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.466529 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97c36d1-1f8b-4e57-952e-19550c836b71-kube-api-access-gccpm" (OuterVolumeSpecName: "kube-api-access-gccpm") pod "c97c36d1-1f8b-4e57-952e-19550c836b71" (UID: "c97c36d1-1f8b-4e57-952e-19550c836b71"). InnerVolumeSpecName "kube-api-access-gccpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.492512 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c97c36d1-1f8b-4e57-952e-19550c836b71" (UID: "c97c36d1-1f8b-4e57-952e-19550c836b71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.531743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-config-data" (OuterVolumeSpecName: "config-data") pod "c97c36d1-1f8b-4e57-952e-19550c836b71" (UID: "c97c36d1-1f8b-4e57-952e-19550c836b71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.559354 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.559399 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c97c36d1-1f8b-4e57-952e-19550c836b71-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.559416 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.559432 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-db-sync-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.559449 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97c36d1-1f8b-4e57-952e-19550c836b71-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:42 crc kubenswrapper[4794]: I1215 14:23:42.559466 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gccpm\" (UniqueName: \"kubernetes.io/projected/c97c36d1-1f8b-4e57-952e-19550c836b71-kube-api-access-gccpm\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.066275 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-wl49x" event={"ID":"c97c36d1-1f8b-4e57-952e-19550c836b71","Type":"ContainerDied","Data":"b196909ef2cf0eef81a7f9f345255464480a1e744374767208ac240e1da3e952"} Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.066312 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b196909ef2cf0eef81a7f9f345255464480a1e744374767208ac240e1da3e952" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.066345 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-wl49x" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.155837 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.370626 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:43 crc kubenswrapper[4794]: E1215 14:23:43.370945 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97c36d1-1f8b-4e57-952e-19550c836b71" containerName="cinder-db-sync" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.370960 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97c36d1-1f8b-4e57-952e-19550c836b71" containerName="cinder-db-sync" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.371126 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97c36d1-1f8b-4e57-952e-19550c836b71" containerName="cinder-db-sync" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.371944 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.375277 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.375981 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.379636 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.386334 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-wt6v7" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.400444 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.402095 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.403665 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.410479 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.427872 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.473750 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.473799 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-scripts\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.473851 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80999d06-5409-493d-a112-aec9648f237c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.473920 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.473978 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.474001 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xt4\" (UniqueName: \"kubernetes.io/projected/80999d06-5409-493d-a112-aec9648f237c-kube-api-access-l2xt4\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.474026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.575908 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.575955 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/80999d06-5409-493d-a112-aec9648f237c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.575983 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-sys\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576000 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-dev\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576013 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576032 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576051 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576074 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576089 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576105 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6kz\" (UniqueName: \"kubernetes.io/projected/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-kube-api-access-lt6kz\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576121 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-lib-modules\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576136 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576167 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576194 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576218 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xt4\" (UniqueName: \"kubernetes.io/projected/80999d06-5409-493d-a112-aec9648f237c-kube-api-access-l2xt4\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576244 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576259 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-run\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576279 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576304 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-scripts\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576363 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576379 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-nvme\") pod 
\"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.576395 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-scripts\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.577222 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80999d06-5409-493d-a112-aec9648f237c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.583257 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.583604 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-scripts\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.597555 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 
14:23:43.603117 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.616153 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.619084 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xt4\" (UniqueName: \"kubernetes.io/projected/80999d06-5409-493d-a112-aec9648f237c-kube-api-access-l2xt4\") pod \"cinder-scheduler-0\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678575 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-run\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678644 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-scripts\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678687 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678720 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678740 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678760 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-sys\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678773 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-dev\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " 
pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678787 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678802 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678822 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678836 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678852 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6kz\" (UniqueName: \"kubernetes.io/projected/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-kube-api-access-lt6kz\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678866 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-lib-modules\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678882 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.678913 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.679809 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-run\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.679852 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.689382 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-sys\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.689445 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.689702 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.689789 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.689818 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-lib-modules\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.690059 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-dev\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 
14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.691087 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.691245 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.691550 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.694509 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.700399 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.730258 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " 
pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.730868 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-scripts\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.731265 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.755209 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6kz\" (UniqueName: \"kubernetes.io/projected/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-kube-api-access-lt6kz\") pod \"cinder-backup-0\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.799646 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.802061 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.807973 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.842292 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.882771 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-scripts\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.882847 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb7xv\" (UniqueName: \"kubernetes.io/projected/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-kube-api-access-xb7xv\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.882870 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-logs\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.882914 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 
14:23:43.882974 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.883023 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.883045 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.883059 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.984691 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.984735 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.984754 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.984852 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.984882 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-scripts\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.985070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb7xv\" (UniqueName: \"kubernetes.io/projected/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-kube-api-access-xb7xv\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.985152 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-logs\") pod \"cinder-api-0\" (UID: 
\"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.985297 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.985366 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.987335 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-logs\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.991907 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.994384 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:43 crc kubenswrapper[4794]: I1215 14:23:43.996067 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:43.999871 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-scripts\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.005225 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.011021 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb7xv\" (UniqueName: \"kubernetes.io/projected/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-kube-api-access-xb7xv\") pod \"cinder-api-0\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.020571 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.175410 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.248026 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.343330 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.538704 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:23:44 crc kubenswrapper[4794]: I1215 14:23:44.725753 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:44 crc kubenswrapper[4794]: W1215 14:23:44.734024 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod086b7f5f_f1f5_4ab6_bf83_9f0312a167b1.slice/crio-2f2ac62d3709466fb9c95954da7d818f0cb75402996eed5f2738437f16e4f44c WatchSource:0}: Error finding container 2f2ac62d3709466fb9c95954da7d818f0cb75402996eed5f2738437f16e4f44c: Status 404 returned error can't find the container with id 2f2ac62d3709466fb9c95954da7d818f0cb75402996eed5f2738437f16e4f44c Dec 15 14:23:45 crc kubenswrapper[4794]: I1215 14:23:45.103644 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1","Type":"ContainerStarted","Data":"2f2ac62d3709466fb9c95954da7d818f0cb75402996eed5f2738437f16e4f44c"} Dec 15 14:23:45 crc kubenswrapper[4794]: I1215 14:23:45.107196 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6","Type":"ContainerStarted","Data":"be11bfaacd2576d63776b08436c461c59e16f9842fe8391b5e6319bcdb000ea7"} Dec 15 14:23:45 crc 
kubenswrapper[4794]: I1215 14:23:45.108369 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"80999d06-5409-493d-a112-aec9648f237c","Type":"ContainerStarted","Data":"2f55c54b443808540145ef729b873142c562ad17b2ca67ec1a5ed4bf22b232f1"} Dec 15 14:23:45 crc kubenswrapper[4794]: I1215 14:23:45.581295 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:46 crc kubenswrapper[4794]: I1215 14:23:46.126961 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1","Type":"ContainerStarted","Data":"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622"} Dec 15 14:23:46 crc kubenswrapper[4794]: I1215 14:23:46.142137 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6","Type":"ContainerStarted","Data":"c2540479675f7f8ea0ec4dbc61e4f7ffb499efc879ea7961950f6eb0fa64579b"} Dec 15 14:23:46 crc kubenswrapper[4794]: I1215 14:23:46.142194 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6","Type":"ContainerStarted","Data":"ebf444e1a723d847039bb8fab39571c92b05702aa1eea141c540ec436a76d666"} Dec 15 14:23:46 crc kubenswrapper[4794]: I1215 14:23:46.168621 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=2.241274261 podStartE2EDuration="3.168596775s" podCreationTimestamp="2025-12-15 14:23:43 +0000 UTC" firstStartedPulling="2025-12-15 14:23:44.541816336 +0000 UTC m=+1786.393838774" lastFinishedPulling="2025-12-15 14:23:45.46913885 +0000 UTC m=+1787.321161288" observedRunningTime="2025-12-15 14:23:46.166639309 
+0000 UTC m=+1788.018661757" watchObservedRunningTime="2025-12-15 14:23:46.168596775 +0000 UTC m=+1788.020619233" Dec 15 14:23:46 crc kubenswrapper[4794]: I1215 14:23:46.421429 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:46 crc kubenswrapper[4794]: I1215 14:23:46.805399 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:47 crc kubenswrapper[4794]: I1215 14:23:47.163895 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"80999d06-5409-493d-a112-aec9648f237c","Type":"ContainerStarted","Data":"93d22510286cdb31d0fc3ebd8a70db9e999d0e53ce35d44752bd87f0ca4e7a55"} Dec 15 14:23:47 crc kubenswrapper[4794]: I1215 14:23:47.173177 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1","Type":"ContainerStarted","Data":"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207"} Dec 15 14:23:47 crc kubenswrapper[4794]: I1215 14:23:47.192706 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=4.192689916 podStartE2EDuration="4.192689916s" podCreationTimestamp="2025-12-15 14:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:23:47.190782812 +0000 UTC m=+1789.042805250" watchObservedRunningTime="2025-12-15 14:23:47.192689916 +0000 UTC m=+1789.044712364" Dec 15 14:23:47 crc kubenswrapper[4794]: I1215 14:23:47.981087 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:48 crc 
kubenswrapper[4794]: I1215 14:23:48.184172 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"80999d06-5409-493d-a112-aec9648f237c","Type":"ContainerStarted","Data":"4369fa52dc4006c5d12a87c1fff6c28e0be6e6c6ca7b06e5165658e11612afab"} Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.184342 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.184367 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api-log" containerID="cri-o://e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622" gracePeriod=30 Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.184383 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api" containerID="cri-o://f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207" gracePeriod=30 Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.215508 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=3.615745678 podStartE2EDuration="5.215494482s" podCreationTimestamp="2025-12-15 14:23:43 +0000 UTC" firstStartedPulling="2025-12-15 14:23:44.276043304 +0000 UTC m=+1786.128065742" lastFinishedPulling="2025-12-15 14:23:45.875792098 +0000 UTC m=+1787.727814546" observedRunningTime="2025-12-15 14:23:48.211201651 +0000 UTC m=+1790.063224089" watchObservedRunningTime="2025-12-15 14:23:48.215494482 +0000 UTC m=+1790.067516920" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.692552 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 
15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.791281 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878374 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-combined-ca-bundle\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878432 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data-custom\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878471 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878487 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-scripts\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878515 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb7xv\" (UniqueName: \"kubernetes.io/projected/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-kube-api-access-xb7xv\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 
14:23:48.878547 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-cert-memcached-mtls\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878655 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-etc-machine-id\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.878689 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-logs\") pod \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\" (UID: \"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1\") " Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.879385 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.879853 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-logs" (OuterVolumeSpecName: "logs") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.896216 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-scripts" (OuterVolumeSpecName: "scripts") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.901772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.902735 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-kube-api-access-xb7xv" (OuterVolumeSpecName: "kube-api-access-xb7xv") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "kube-api-access-xb7xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.908131 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.957723 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data" (OuterVolumeSpecName: "config-data") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.957736 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" (UID: "086b7f5f-f1f5-4ab6-bf83-9f0312a167b1"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.980992 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb7xv\" (UniqueName: \"kubernetes.io/projected/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-kube-api-access-xb7xv\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981025 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981034 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981042 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-logs\") on node \"crc\" DevicePath 
\"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981053 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981062 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981070 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:48 crc kubenswrapper[4794]: I1215 14:23:48.981078 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.021328 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.167215 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196320 4794 generic.go:334] "Generic (PLEG): container finished" podID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerID="f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207" exitCode=0 Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196353 4794 generic.go:334] "Generic (PLEG): container finished" podID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerID="e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622" 
exitCode=143 Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196378 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196424 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1","Type":"ContainerDied","Data":"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207"} Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196448 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1","Type":"ContainerDied","Data":"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622"} Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196459 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"086b7f5f-f1f5-4ab6-bf83-9f0312a167b1","Type":"ContainerDied","Data":"2f2ac62d3709466fb9c95954da7d818f0cb75402996eed5f2738437f16e4f44c"} Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.196473 4794 scope.go:117] "RemoveContainer" containerID="f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.229087 4794 scope.go:117] "RemoveContainer" containerID="e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.242525 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.262669 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.272181 4794 scope.go:117] "RemoveContainer" containerID="f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207" Dec 15 14:23:49 crc 
kubenswrapper[4794]: E1215 14:23:49.274061 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207\": container with ID starting with f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207 not found: ID does not exist" containerID="f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.274106 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207"} err="failed to get container status \"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207\": rpc error: code = NotFound desc = could not find container \"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207\": container with ID starting with f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207 not found: ID does not exist" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.274135 4794 scope.go:117] "RemoveContainer" containerID="e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622" Dec 15 14:23:49 crc kubenswrapper[4794]: E1215 14:23:49.274391 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622\": container with ID starting with e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622 not found: ID does not exist" containerID="e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.274419 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622"} err="failed to get container status 
\"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622\": rpc error: code = NotFound desc = could not find container \"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622\": container with ID starting with e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622 not found: ID does not exist" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.274436 4794 scope.go:117] "RemoveContainer" containerID="f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.274745 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207"} err="failed to get container status \"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207\": rpc error: code = NotFound desc = could not find container \"f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207\": container with ID starting with f8a1a595904392de6d6bb42e76de26038c1d1106f8ed28551954e48506172207 not found: ID does not exist" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.274769 4794 scope.go:117] "RemoveContainer" containerID="e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.275516 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622"} err="failed to get container status \"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622\": rpc error: code = NotFound desc = could not find container \"e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622\": container with ID starting with e4f38f76f7109e9ace9e17786137635e82b623750bec7b68108363ad958d5622 not found: ID does not exist" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.277399 4794 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:49 crc kubenswrapper[4794]: E1215 14:23:49.277974 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api-log" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.278004 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api-log" Dec 15 14:23:49 crc kubenswrapper[4794]: E1215 14:23:49.278036 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.278045 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.278410 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api-log" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.278445 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" containerName="cinder-api" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.289705 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.289822 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.294281 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-cinder-internal-svc" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.294550 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-cinder-public-svc" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.294730 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389000 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389079 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113c107d-bfbc-43bd-bddb-401c41f0ac1b-logs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389120 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2m4\" (UniqueName: \"kubernetes.io/projected/113c107d-bfbc-43bd-bddb-401c41f0ac1b-kube-api-access-zc2m4\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389151 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389179 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-scripts\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389218 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/113c107d-bfbc-43bd-bddb-401c41f0ac1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389233 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389260 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389322 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.389360 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.490774 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491062 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491091 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491137 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113c107d-bfbc-43bd-bddb-401c41f0ac1b-logs\") pod \"cinder-api-0\" (UID: 
\"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491160 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc2m4\" (UniqueName: \"kubernetes.io/projected/113c107d-bfbc-43bd-bddb-401c41f0ac1b-kube-api-access-zc2m4\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491180 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491198 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-scripts\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491224 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/113c107d-bfbc-43bd-bddb-401c41f0ac1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491238 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.491820 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.492107 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/113c107d-bfbc-43bd-bddb-401c41f0ac1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.492357 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113c107d-bfbc-43bd-bddb-401c41f0ac1b-logs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.495654 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.497559 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-scripts\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.497810 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.497985 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.498105 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.499668 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.502300 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.514458 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2m4\" (UniqueName: \"kubernetes.io/projected/113c107d-bfbc-43bd-bddb-401c41f0ac1b-kube-api-access-zc2m4\") pod \"cinder-api-0\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:49 crc kubenswrapper[4794]: I1215 14:23:49.659863 4794 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:50 crc kubenswrapper[4794]: I1215 14:23:50.188443 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:23:50 crc kubenswrapper[4794]: I1215 14:23:50.205895 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"113c107d-bfbc-43bd-bddb-401c41f0ac1b","Type":"ContainerStarted","Data":"a1a713c7dc815e72936f3152fb380272c9f8566167278d5867f3b64650a4e4f0"} Dec 15 14:23:50 crc kubenswrapper[4794]: I1215 14:23:50.416737 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:50 crc kubenswrapper[4794]: I1215 14:23:50.752316 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086b7f5f-f1f5-4ab6-bf83-9f0312a167b1" path="/var/lib/kubelet/pods/086b7f5f-f1f5-4ab6-bf83-9f0312a167b1/volumes" Dec 15 14:23:51 crc kubenswrapper[4794]: I1215 14:23:51.217862 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"113c107d-bfbc-43bd-bddb-401c41f0ac1b","Type":"ContainerStarted","Data":"11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705"} Dec 15 14:23:51 crc kubenswrapper[4794]: I1215 14:23:51.626293 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:51 crc kubenswrapper[4794]: I1215 14:23:51.737452 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:23:51 crc kubenswrapper[4794]: E1215 14:23:51.737886 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:23:52 crc kubenswrapper[4794]: I1215 14:23:52.228273 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"113c107d-bfbc-43bd-bddb-401c41f0ac1b","Type":"ContainerStarted","Data":"eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91"} Dec 15 14:23:52 crc kubenswrapper[4794]: I1215 14:23:52.228755 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:23:52 crc kubenswrapper[4794]: I1215 14:23:52.260683 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=3.260655582 podStartE2EDuration="3.260655582s" podCreationTimestamp="2025-12-15 14:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:23:52.2503414 +0000 UTC m=+1794.102363848" watchObservedRunningTime="2025-12-15 14:23:52.260655582 +0000 UTC m=+1794.112678040" Dec 15 14:23:52 crc kubenswrapper[4794]: I1215 14:23:52.806597 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:53 crc kubenswrapper[4794]: I1215 14:23:53.954854 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:53 crc kubenswrapper[4794]: I1215 14:23:53.988291 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:54 crc kubenswrapper[4794]: I1215 14:23:54.045826 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:54 crc kubenswrapper[4794]: I1215 14:23:54.242434 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="cinder-scheduler" containerID="cri-o://93d22510286cdb31d0fc3ebd8a70db9e999d0e53ce35d44752bd87f0ca4e7a55" gracePeriod=30 Dec 15 14:23:54 crc kubenswrapper[4794]: I1215 14:23:54.242472 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="probe" containerID="cri-o://4369fa52dc4006c5d12a87c1fff6c28e0be6e6c6ca7b06e5165658e11612afab" gracePeriod=30 Dec 15 14:23:54 crc kubenswrapper[4794]: I1215 14:23:54.386758 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:54 crc kubenswrapper[4794]: I1215 14:23:54.432430 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.186730 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.253258 4794 generic.go:334] "Generic (PLEG): container finished" podID="80999d06-5409-493d-a112-aec9648f237c" containerID="4369fa52dc4006c5d12a87c1fff6c28e0be6e6c6ca7b06e5165658e11612afab" exitCode=0 Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.253295 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="80999d06-5409-493d-a112-aec9648f237c" containerID="93d22510286cdb31d0fc3ebd8a70db9e999d0e53ce35d44752bd87f0ca4e7a55" exitCode=0 Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.253334 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"80999d06-5409-493d-a112-aec9648f237c","Type":"ContainerDied","Data":"4369fa52dc4006c5d12a87c1fff6c28e0be6e6c6ca7b06e5165658e11612afab"} Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.253381 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"80999d06-5409-493d-a112-aec9648f237c","Type":"ContainerDied","Data":"93d22510286cdb31d0fc3ebd8a70db9e999d0e53ce35d44752bd87f0ca4e7a55"} Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.253554 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="cinder-backup" containerID="cri-o://ebf444e1a723d847039bb8fab39571c92b05702aa1eea141c540ec436a76d666" gracePeriod=30 Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.253677 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="probe" containerID="cri-o://c2540479675f7f8ea0ec4dbc61e4f7ffb499efc879ea7961950f6eb0fa64579b" gracePeriod=30 Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.382790 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.384395 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="5432b186-d38f-40b5-9979-8d197844a905" containerName="watcher-decision-engine" 
containerID="cri-o://5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1" gracePeriod=30 Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.519307 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.596900 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data\") pod \"80999d06-5409-493d-a112-aec9648f237c\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.596964 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data-custom\") pod \"80999d06-5409-493d-a112-aec9648f237c\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.597036 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80999d06-5409-493d-a112-aec9648f237c-etc-machine-id\") pod \"80999d06-5409-493d-a112-aec9648f237c\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.597105 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-combined-ca-bundle\") pod \"80999d06-5409-493d-a112-aec9648f237c\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.597168 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-scripts\") pod \"80999d06-5409-493d-a112-aec9648f237c\" 
(UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.597198 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-cert-memcached-mtls\") pod \"80999d06-5409-493d-a112-aec9648f237c\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.597215 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2xt4\" (UniqueName: \"kubernetes.io/projected/80999d06-5409-493d-a112-aec9648f237c-kube-api-access-l2xt4\") pod \"80999d06-5409-493d-a112-aec9648f237c\" (UID: \"80999d06-5409-493d-a112-aec9648f237c\") " Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.598528 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80999d06-5409-493d-a112-aec9648f237c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.605169 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80999d06-5409-493d-a112-aec9648f237c-kube-api-access-l2xt4" (OuterVolumeSpecName: "kube-api-access-l2xt4") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "kube-api-access-l2xt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.605183 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-scripts" (OuterVolumeSpecName: "scripts") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.605988 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.665922 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.699224 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.699256 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80999d06-5409-493d-a112-aec9648f237c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.699268 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.699280 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.699292 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2xt4\" (UniqueName: \"kubernetes.io/projected/80999d06-5409-493d-a112-aec9648f237c-kube-api-access-l2xt4\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.735294 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data" (OuterVolumeSpecName: "config-data") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.760131 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "80999d06-5409-493d-a112-aec9648f237c" (UID: "80999d06-5409-493d-a112-aec9648f237c"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.801332 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:55 crc kubenswrapper[4794]: I1215 14:23:55.801707 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/80999d06-5409-493d-a112-aec9648f237c-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.263692 4794 generic.go:334] "Generic (PLEG): container finished" podID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerID="c2540479675f7f8ea0ec4dbc61e4f7ffb499efc879ea7961950f6eb0fa64579b" exitCode=0 Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.263792 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6","Type":"ContainerDied","Data":"c2540479675f7f8ea0ec4dbc61e4f7ffb499efc879ea7961950f6eb0fa64579b"} Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.266079 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"80999d06-5409-493d-a112-aec9648f237c","Type":"ContainerDied","Data":"2f55c54b443808540145ef729b873142c562ad17b2ca67ec1a5ed4bf22b232f1"} Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.266142 4794 
scope.go:117] "RemoveContainer" containerID="4369fa52dc4006c5d12a87c1fff6c28e0be6e6c6ca7b06e5165658e11612afab" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.266310 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.306388 4794 scope.go:117] "RemoveContainer" containerID="93d22510286cdb31d0fc3ebd8a70db9e999d0e53ce35d44752bd87f0ca4e7a55" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.309906 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.317366 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.332296 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:56 crc kubenswrapper[4794]: E1215 14:23:56.332717 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="cinder-scheduler" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.332737 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="cinder-scheduler" Dec 15 14:23:56 crc kubenswrapper[4794]: E1215 14:23:56.332765 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="probe" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.332774 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="probe" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.332962 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="cinder-scheduler" Dec 15 14:23:56 crc kubenswrapper[4794]: 
I1215 14:23:56.332996 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="80999d06-5409-493d-a112-aec9648f237c" containerName="probe" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.334071 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.336428 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.356015 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.403062 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413659 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413713 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413782 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413806 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-scripts\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413852 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s566t\" (UniqueName: \"kubernetes.io/projected/029ea379-16dc-4fbe-9681-fbeb63bcc952-kube-api-access-s566t\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413883 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.413905 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/029ea379-16dc-4fbe-9681-fbeb63bcc952-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.453610 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:23:56 crc 
kubenswrapper[4794]: I1215 14:23:56.454598 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-central-agent" containerID="cri-o://c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73" gracePeriod=30 Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.454712 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="sg-core" containerID="cri-o://bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1" gracePeriod=30 Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.454743 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-notification-agent" containerID="cri-o://29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866" gracePeriod=30 Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.454667 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="proxy-httpd" containerID="cri-o://b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b" gracePeriod=30 Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.515956 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516012 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/029ea379-16dc-4fbe-9681-fbeb63bcc952-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516070 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516099 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516158 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516170 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/029ea379-16dc-4fbe-9681-fbeb63bcc952-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516178 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.516716 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s566t\" (UniqueName: \"kubernetes.io/projected/029ea379-16dc-4fbe-9681-fbeb63bcc952-kube-api-access-s566t\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.520342 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-scripts\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.520769 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.522992 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.529822 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc 
kubenswrapper[4794]: I1215 14:23:56.534666 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.535200 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s566t\" (UniqueName: \"kubernetes.io/projected/029ea379-16dc-4fbe-9681-fbeb63bcc952-kube-api-access-s566t\") pod \"cinder-scheduler-0\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.668612 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:23:56 crc kubenswrapper[4794]: I1215 14:23:56.753496 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80999d06-5409-493d-a112-aec9648f237c" path="/var/lib/kubelet/pods/80999d06-5409-493d-a112-aec9648f237c/volumes" Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.172673 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.293916 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"029ea379-16dc-4fbe-9681-fbeb63bcc952","Type":"ContainerStarted","Data":"41e0e1ca3d1c4ba7a5f01d9cdb8764248fdea6b7266cbb02b838995dee70e276"} Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.297994 4794 generic.go:334] "Generic (PLEG): container finished" podID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerID="b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b" exitCode=0 Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.298026 4794 
generic.go:334] "Generic (PLEG): container finished" podID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerID="bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1" exitCode=2 Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.298036 4794 generic.go:334] "Generic (PLEG): container finished" podID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerID="c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73" exitCode=0 Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.298054 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerDied","Data":"b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b"} Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.298088 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerDied","Data":"bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1"} Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.298101 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerDied","Data":"c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73"} Dec 15 14:23:57 crc kubenswrapper[4794]: I1215 14:23:57.626954 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:58 crc kubenswrapper[4794]: I1215 14:23:58.357159 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"029ea379-16dc-4fbe-9681-fbeb63bcc952","Type":"ContainerStarted","Data":"d8a2efb48013fcf6f4b7db86c11b367946302f6715a1f5f607699047432aad81"} Dec 15 14:23:58 crc kubenswrapper[4794]: I1215 14:23:58.809801 
4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_5432b186-d38f-40b5-9979-8d197844a905/watcher-decision-engine/0.log" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.232014 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371043 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww7q8\" (UniqueName: \"kubernetes.io/projected/5432b186-d38f-40b5-9979-8d197844a905-kube-api-access-ww7q8\") pod \"5432b186-d38f-40b5-9979-8d197844a905\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371116 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-cert-memcached-mtls\") pod \"5432b186-d38f-40b5-9979-8d197844a905\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371271 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5432b186-d38f-40b5-9979-8d197844a905-logs\") pod \"5432b186-d38f-40b5-9979-8d197844a905\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371334 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-combined-ca-bundle\") pod \"5432b186-d38f-40b5-9979-8d197844a905\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371355 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-custom-prometheus-ca\") pod \"5432b186-d38f-40b5-9979-8d197844a905\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371403 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-config-data\") pod \"5432b186-d38f-40b5-9979-8d197844a905\" (UID: \"5432b186-d38f-40b5-9979-8d197844a905\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.371734 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5432b186-d38f-40b5-9979-8d197844a905-logs" (OuterVolumeSpecName: "logs") pod "5432b186-d38f-40b5-9979-8d197844a905" (UID: "5432b186-d38f-40b5-9979-8d197844a905"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.381528 4794 generic.go:334] "Generic (PLEG): container finished" podID="5432b186-d38f-40b5-9979-8d197844a905" containerID="5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1" exitCode=0 Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.381632 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"5432b186-d38f-40b5-9979-8d197844a905","Type":"ContainerDied","Data":"5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1"} Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.381657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"5432b186-d38f-40b5-9979-8d197844a905","Type":"ContainerDied","Data":"31512775badb369fdf71a901bf72b575e292ed8289eff0440ab0a577507b757e"} Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.381674 4794 scope.go:117] "RemoveContainer" 
containerID="5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.381796 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.381810 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5432b186-d38f-40b5-9979-8d197844a905-kube-api-access-ww7q8" (OuterVolumeSpecName: "kube-api-access-ww7q8") pod "5432b186-d38f-40b5-9979-8d197844a905" (UID: "5432b186-d38f-40b5-9979-8d197844a905"). InnerVolumeSpecName "kube-api-access-ww7q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.395809 4794 generic.go:334] "Generic (PLEG): container finished" podID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerID="ebf444e1a723d847039bb8fab39571c92b05702aa1eea141c540ec436a76d666" exitCode=0 Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.395887 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6","Type":"ContainerDied","Data":"ebf444e1a723d847039bb8fab39571c92b05702aa1eea141c540ec436a76d666"} Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.400808 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5432b186-d38f-40b5-9979-8d197844a905" (UID: "5432b186-d38f-40b5-9979-8d197844a905"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.407845 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"029ea379-16dc-4fbe-9681-fbeb63bcc952","Type":"ContainerStarted","Data":"084d0f42b92f6ca4b6944a7d7a157c3785777c03e634e8cc99c61ad69c96cc19"} Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.410719 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "5432b186-d38f-40b5-9979-8d197844a905" (UID: "5432b186-d38f-40b5-9979-8d197844a905"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.474954 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.475034 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.475206 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww7q8\" (UniqueName: \"kubernetes.io/projected/5432b186-d38f-40b5-9979-8d197844a905-kube-api-access-ww7q8\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.475228 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5432b186-d38f-40b5-9979-8d197844a905-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.489447 4794 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-config-data" (OuterVolumeSpecName: "config-data") pod "5432b186-d38f-40b5-9979-8d197844a905" (UID: "5432b186-d38f-40b5-9979-8d197844a905"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.508865 4794 scope.go:117] "RemoveContainer" containerID="5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1" Dec 15 14:23:59 crc kubenswrapper[4794]: E1215 14:23:59.509521 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1\": container with ID starting with 5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1 not found: ID does not exist" containerID="5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.509561 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1"} err="failed to get container status \"5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1\": rpc error: code = NotFound desc = could not find container \"5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1\": container with ID starting with 5cbf8d69d547751a166fbcf121b7df746426e95f7d09e1986a5e6b2c55dae6d1 not found: ID does not exist" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.514781 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "5432b186-d38f-40b5-9979-8d197844a905" (UID: "5432b186-d38f-40b5-9979-8d197844a905"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.577912 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.577942 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5432b186-d38f-40b5-9979-8d197844a905-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.632970 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.696374 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=3.6963576959999997 podStartE2EDuration="3.696357696s" podCreationTimestamp="2025-12-15 14:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:23:59.441305307 +0000 UTC m=+1801.293327755" watchObservedRunningTime="2025-12-15 14:23:59.696357696 +0000 UTC m=+1801.548380134" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.741770 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.751838 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.769758 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:23:59 crc kubenswrapper[4794]: E1215 14:23:59.770167 4794 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5432b186-d38f-40b5-9979-8d197844a905" containerName="watcher-decision-engine" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.770189 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="5432b186-d38f-40b5-9979-8d197844a905" containerName="watcher-decision-engine" Dec 15 14:23:59 crc kubenswrapper[4794]: E1215 14:23:59.770202 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="cinder-backup" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.770212 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="cinder-backup" Dec 15 14:23:59 crc kubenswrapper[4794]: E1215 14:23:59.770253 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="probe" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.770261 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="probe" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.770458 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="5432b186-d38f-40b5-9979-8d197844a905" containerName="watcher-decision-engine" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.770480 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="cinder-backup" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.770488 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" containerName="probe" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.771138 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.774326 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781282 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-brick\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781324 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-nvme\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781347 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-dev\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781392 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-lib-cinder\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781405 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-cinder\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: 
\"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781428 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-iscsi\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781463 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data-custom\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781483 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-cert-memcached-mtls\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781502 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-machine-id\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781529 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt6kz\" (UniqueName: \"kubernetes.io/projected/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-kube-api-access-lt6kz\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781556 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-run\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781591 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-combined-ca-bundle\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781611 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781642 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-lib-modules\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781709 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-scripts\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.781733 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-sys\") pod \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\" (UID: \"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6\") " Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782123 4794 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-sys" (OuterVolumeSpecName: "sys") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782654 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782685 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782702 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-dev" (OuterVolumeSpecName: "dev") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782742 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782760 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.782776 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.785171 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.785335 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.788443 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-run" (OuterVolumeSpecName: "run") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.795510 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.797071 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-scripts" (OuterVolumeSpecName: "scripts") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.802640 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-kube-api-access-lt6kz" (OuterVolumeSpecName: "kube-api-access-lt6kz") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "kube-api-access-lt6kz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.802703 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.842193 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.883074 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.883784 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.883835 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnwlc\" (UniqueName: \"kubernetes.io/projected/ac41a509-04ac-4a2a-b365-61599c44f9da-kube-api-access-cnwlc\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 
14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.883874 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.883910 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.883969 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac41a509-04ac-4a2a-b365-61599c44f9da-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884043 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884056 4794 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-sys\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884067 4794 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-brick\") on node 
\"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884079 4794 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884091 4794 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-dev\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884101 4794 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884110 4794 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884120 4794 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884129 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884139 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884148 4794 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lt6kz\" (UniqueName: \"kubernetes.io/projected/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-kube-api-access-lt6kz\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884157 4794 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-run\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884167 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.884176 4794 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.897450 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data" (OuterVolumeSpecName: "config-data") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.928329 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" (UID: "3f2ffd71-9a74-442e-9a32-eb5c907d1aa6"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.985992 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986083 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac41a509-04ac-4a2a-b365-61599c44f9da-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986158 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986190 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986236 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnwlc\" (UniqueName: \"kubernetes.io/projected/ac41a509-04ac-4a2a-b365-61599c44f9da-kube-api-access-cnwlc\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986277 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986330 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986343 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.986776 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac41a509-04ac-4a2a-b365-61599c44f9da-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.989323 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.989665 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" 
(UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.992749 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:23:59 crc kubenswrapper[4794]: I1215 14:23:59.997110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.006954 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnwlc\" (UniqueName: \"kubernetes.io/projected/ac41a509-04ac-4a2a-b365-61599c44f9da-kube-api-access-cnwlc\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.098182 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.419536 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.426975 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3f2ffd71-9a74-442e-9a32-eb5c907d1aa6","Type":"ContainerDied","Data":"be11bfaacd2576d63776b08436c461c59e16f9842fe8391b5e6319bcdb000ea7"} Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.427069 4794 scope.go:117] "RemoveContainer" containerID="c2540479675f7f8ea0ec4dbc61e4f7ffb499efc879ea7961950f6eb0fa64579b" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.453211 4794 scope.go:117] "RemoveContainer" containerID="ebf444e1a723d847039bb8fab39571c92b05702aa1eea141c540ec436a76d666" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.473023 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.483704 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.496148 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.497482 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.501984 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.515900 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.557839 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:00 crc kubenswrapper[4794]: W1215 14:24:00.577071 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac41a509_04ac_4a2a_b365_61599c44f9da.slice/crio-28c240872d9ecbb8af50de240ba1a2b08bcaf823d7ed87616331d5feb692fb54 WatchSource:0}: Error finding container 28c240872d9ecbb8af50de240ba1a2b08bcaf823d7ed87616331d5feb692fb54: Status 404 returned error can't find the container with id 28c240872d9ecbb8af50de240ba1a2b08bcaf823d7ed87616331d5feb692fb54 Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.595896 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-lib-modules\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.595942 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.595969 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596009 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-nvme\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596040 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596067 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-sys\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596087 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596109 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596146 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-scripts\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596166 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596188 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596242 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wfm4\" (UniqueName: \"kubernetes.io/projected/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-kube-api-access-4wfm4\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596281 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-dev\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596318 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data-custom\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596340 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.596371 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-run\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.697718 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.697788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-combined-ca-bundle\") 
pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.697855 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wfm4\" (UniqueName: \"kubernetes.io/projected/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-kube-api-access-4wfm4\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.697894 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-dev\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.697942 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data-custom\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.697964 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698003 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-run\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc 
kubenswrapper[4794]: I1215 14:24:00.698035 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-lib-modules\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698058 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698083 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698123 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-nvme\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698157 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698182 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-sys\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698206 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698232 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.698270 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-scripts\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699009 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-lib-modules\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699053 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-dev\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 
15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699124 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699288 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699314 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699340 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-nvme\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699358 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-run\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699414 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-sys\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.699457 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.702281 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-scripts\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.704004 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data-custom\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.704227 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 
crc kubenswrapper[4794]: I1215 14:24:00.704557 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.704550 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.716195 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wfm4\" (UniqueName: \"kubernetes.io/projected/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-kube-api-access-4wfm4\") pod \"cinder-backup-0\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.750563 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f2ffd71-9a74-442e-9a32-eb5c907d1aa6" path="/var/lib/kubelet/pods/3f2ffd71-9a74-442e-9a32-eb5c907d1aa6/volumes" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.751833 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5432b186-d38f-40b5-9979-8d197844a905" path="/var/lib/kubelet/pods/5432b186-d38f-40b5-9979-8d197844a905/volumes" Dec 15 14:24:00 crc kubenswrapper[4794]: I1215 14:24:00.819091 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:01 crc kubenswrapper[4794]: I1215 14:24:01.284289 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:01 crc kubenswrapper[4794]: I1215 14:24:01.427903 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"fb0abf6f-ac5a-4793-96ae-02221cbb00bf","Type":"ContainerStarted","Data":"e10af78f6577d45844503288f0d9431971a10bfed324fc3c8e69cd4e6b6ceb48"} Dec 15 14:24:01 crc kubenswrapper[4794]: I1215 14:24:01.429662 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ac41a509-04ac-4a2a-b365-61599c44f9da","Type":"ContainerStarted","Data":"8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af"} Dec 15 14:24:01 crc kubenswrapper[4794]: I1215 14:24:01.429687 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ac41a509-04ac-4a2a-b365-61599c44f9da","Type":"ContainerStarted","Data":"28c240872d9ecbb8af50de240ba1a2b08bcaf823d7ed87616331d5feb692fb54"} Dec 15 14:24:01 crc kubenswrapper[4794]: I1215 14:24:01.450912 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.450896239 podStartE2EDuration="2.450896239s" podCreationTimestamp="2025-12-15 14:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:24:01.443970323 +0000 UTC m=+1803.295992771" watchObservedRunningTime="2025-12-15 14:24:01.450896239 +0000 UTC m=+1803.302918677" Dec 15 14:24:01 crc kubenswrapper[4794]: I1215 14:24:01.669008 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:24:01 crc 
kubenswrapper[4794]: I1215 14:24:01.774877 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:24:02 crc kubenswrapper[4794]: I1215 14:24:02.352499 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:02 crc kubenswrapper[4794]: I1215 14:24:02.446112 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"fb0abf6f-ac5a-4793-96ae-02221cbb00bf","Type":"ContainerStarted","Data":"717081ba19ae34c169ea744a10b4b1b8d7b8db704b768e97238390df639b19d3"} Dec 15 14:24:02 crc kubenswrapper[4794]: I1215 14:24:02.446148 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"fb0abf6f-ac5a-4793-96ae-02221cbb00bf","Type":"ContainerStarted","Data":"d137d31e7c51b907fe8218663f3e8b3cf1a5402d9e65fd92c1666bd9dc760f2d"} Dec 15 14:24:02 crc kubenswrapper[4794]: I1215 14:24:02.476715 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=2.47669822 podStartE2EDuration="2.47669822s" podCreationTimestamp="2025-12-15 14:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:24:02.470220047 +0000 UTC m=+1804.322242495" watchObservedRunningTime="2025-12-15 14:24:02.47669822 +0000 UTC m=+1804.328720678" Dec 15 14:24:03 crc kubenswrapper[4794]: I1215 14:24:03.528618 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:04 crc kubenswrapper[4794]: I1215 14:24:04.779379 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:04 crc kubenswrapper[4794]: I1215 14:24:04.950307 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079382 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-ceilometer-tls-certs\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-run-httpd\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079515 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4ttt\" (UniqueName: \"kubernetes.io/projected/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-kube-api-access-f4ttt\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079551 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-combined-ca-bundle\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079576 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-log-httpd\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079632 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-config-data\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079674 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-scripts\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079698 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-sg-core-conf-yaml\") pod \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\" (UID: \"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76\") " Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.079980 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.080506 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.092852 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-kube-api-access-f4ttt" (OuterVolumeSpecName: "kube-api-access-f4ttt") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "kube-api-access-f4ttt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.094868 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-scripts" (OuterVolumeSpecName: "scripts") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.108435 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.134036 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.155649 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181453 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181486 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4ttt\" (UniqueName: \"kubernetes.io/projected/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-kube-api-access-f4ttt\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181495 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181505 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181513 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181521 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.181529 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.186551 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-config-data" (OuterVolumeSpecName: "config-data") pod "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" (UID: "ce95d711-ab01-44f6-8a2b-dcf1b4b43f76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.282830 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.475933 4794 generic.go:334] "Generic (PLEG): container finished" podID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerID="29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866" exitCode=0 Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.475985 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerDied","Data":"29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866"} Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.476017 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ce95d711-ab01-44f6-8a2b-dcf1b4b43f76","Type":"ContainerDied","Data":"5663a1fd1106d7d84e3342dd7b0079e55865ae32658de94421c574a4802e7d73"} Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 
14:24:05.476037 4794 scope.go:117] "RemoveContainer" containerID="b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.476190 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.514942 4794 scope.go:117] "RemoveContainer" containerID="bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.525632 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.544010 4794 scope.go:117] "RemoveContainer" containerID="29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.567116 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.573100 4794 scope.go:117] "RemoveContainer" containerID="c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.589287 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.590496 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="proxy-httpd" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.590532 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="proxy-httpd" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.590567 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-central-agent" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.590601 
4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-central-agent" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.590624 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-notification-agent" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.590635 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-notification-agent" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.590672 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="sg-core" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.590682 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="sg-core" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.590961 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-central-agent" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.590993 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="sg-core" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.591021 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="ceilometer-notification-agent" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.591042 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" containerName="proxy-httpd" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.593830 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.599602 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.601005 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.601040 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.601227 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.648334 4794 scope.go:117] "RemoveContainer" containerID="b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.648754 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b\": container with ID starting with b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b not found: ID does not exist" containerID="b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.648778 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b"} err="failed to get container status \"b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b\": rpc error: code = NotFound desc = could not find container \"b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b\": container with ID starting with b59595d7007d6b02e3dc9d502a6f75e0111c17bf204681e156c49ab5564f0c4b not found: ID does not 
exist" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.648798 4794 scope.go:117] "RemoveContainer" containerID="bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.649118 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1\": container with ID starting with bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1 not found: ID does not exist" containerID="bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.649133 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1"} err="failed to get container status \"bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1\": rpc error: code = NotFound desc = could not find container \"bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1\": container with ID starting with bf94a646eb99d1adf776e4b408c04efe320869e7ec29ee0efcfa7db78f5ea7d1 not found: ID does not exist" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.649147 4794 scope.go:117] "RemoveContainer" containerID="29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.649506 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866\": container with ID starting with 29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866 not found: ID does not exist" containerID="29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.649558 4794 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866"} err="failed to get container status \"29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866\": rpc error: code = NotFound desc = could not find container \"29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866\": container with ID starting with 29bdf365e6c07ae8772e60f8c5c7bf62e0ebfe6770006cc43ed68f62cd4a8866 not found: ID does not exist" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.649615 4794 scope.go:117] "RemoveContainer" containerID="c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73" Dec 15 14:24:05 crc kubenswrapper[4794]: E1215 14:24:05.650053 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73\": container with ID starting with c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73 not found: ID does not exist" containerID="c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.650151 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73"} err="failed to get container status \"c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73\": rpc error: code = NotFound desc = could not find container \"c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73\": container with ID starting with c673c2e101955e2952c139e836b41428f3ecf951947c10e736092ad9924c5f73 not found: ID does not exist" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.697308 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.697534 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-run-httpd\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.697694 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-scripts\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.697781 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.697864 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5st\" (UniqueName: \"kubernetes.io/projected/017b7b46-6faf-41b9-994b-3f30078d6086-kube-api-access-rv5st\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.697944 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.698021 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-log-httpd\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.698087 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-config-data\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799543 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-scripts\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799642 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799680 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5st\" (UniqueName: \"kubernetes.io/projected/017b7b46-6faf-41b9-994b-3f30078d6086-kube-api-access-rv5st\") pod \"ceilometer-0\" (UID: 
\"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799703 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799733 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-log-httpd\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799747 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-config-data\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.799811 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-run-httpd\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.800290 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-run-httpd\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.801474 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-log-httpd\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.806051 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.806110 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.806172 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.806538 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-scripts\") pod \"ceilometer-0\" (UID: 
\"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.806683 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-config-data\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.823663 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.824169 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5st\" (UniqueName: \"kubernetes.io/projected/017b7b46-6faf-41b9-994b-3f30078d6086-kube-api-access-rv5st\") pod \"ceilometer-0\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:05 crc kubenswrapper[4794]: I1215 14:24:05.956755 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:06 crc kubenswrapper[4794]: I1215 14:24:06.100198 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:06 crc kubenswrapper[4794]: I1215 14:24:06.598025 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:06 crc kubenswrapper[4794]: W1215 14:24:06.610751 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod017b7b46_6faf_41b9_994b_3f30078d6086.slice/crio-d8d9eab8c638cd1211f53f31b5b30368a011284df63a2db704996cabd052ee77 WatchSource:0}: Error finding container d8d9eab8c638cd1211f53f31b5b30368a011284df63a2db704996cabd052ee77: Status 404 returned error can't find the container with id d8d9eab8c638cd1211f53f31b5b30368a011284df63a2db704996cabd052ee77 Dec 15 14:24:06 crc kubenswrapper[4794]: I1215 14:24:06.747484 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce95d711-ab01-44f6-8a2b-dcf1b4b43f76" path="/var/lib/kubelet/pods/ce95d711-ab01-44f6-8a2b-dcf1b4b43f76/volumes" Dec 15 14:24:06 crc kubenswrapper[4794]: I1215 14:24:06.865699 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.287479 4794 scope.go:117] "RemoveContainer" containerID="05f8e2e23f7ae367489dc44d588c2ebf5dd149eb3e3b474358ad41e06e966d69" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.321366 4794 scope.go:117] "RemoveContainer" containerID="eca45b99c40e28cff944474b0e9f5460419af597bedfea01d5fdb79aaeb64dcf" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.331366 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.356596 4794 scope.go:117] "RemoveContainer" containerID="65825606fd7bd743ce43eb07d98d10115e303558b720ba3f0239a9c1c7486272" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.375876 4794 scope.go:117] "RemoveContainer" containerID="68905e4811cd3e0a006919c95853600fa8465a774793d694a28b9057cc5f7b39" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.394903 4794 scope.go:117] "RemoveContainer" containerID="1ba27cc53fa27620ff5b6f4db71b3310018a2f348c70d6d3bfb7eabd5585e8e4" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.418384 4794 scope.go:117] "RemoveContainer" containerID="4a9c8aec345f8741051df20d4437351ff67e6c017612f9ea4905e46f0d4a5a48" Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.584769 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerStarted","Data":"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb"} Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.584815 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerStarted","Data":"d8d9eab8c638cd1211f53f31b5b30368a011284df63a2db704996cabd052ee77"} Dec 15 14:24:07 crc kubenswrapper[4794]: I1215 14:24:07.737952 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:24:07 crc kubenswrapper[4794]: E1215 14:24:07.738518 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:24:08 crc kubenswrapper[4794]: I1215 14:24:08.538172 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:08 crc kubenswrapper[4794]: I1215 14:24:08.603999 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerStarted","Data":"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1"} Dec 15 14:24:09 crc kubenswrapper[4794]: I1215 14:24:09.615500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerStarted","Data":"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b"} Dec 15 14:24:09 crc kubenswrapper[4794]: I1215 14:24:09.727524 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:10 crc kubenswrapper[4794]: I1215 14:24:10.099829 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:10 crc kubenswrapper[4794]: I1215 14:24:10.136350 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:10 crc kubenswrapper[4794]: I1215 14:24:10.623753 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:10 crc kubenswrapper[4794]: I1215 14:24:10.645457 4794 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:10 crc kubenswrapper[4794]: I1215 14:24:10.896866 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:11 crc kubenswrapper[4794]: I1215 14:24:11.042088 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:11 crc kubenswrapper[4794]: I1215 14:24:11.061442 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-create-5rt74"] Dec 15 14:24:11 crc kubenswrapper[4794]: I1215 14:24:11.071213 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-create-5rt74"] Dec 15 14:24:11 crc kubenswrapper[4794]: I1215 14:24:11.634596 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerStarted","Data":"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2"} Dec 15 14:24:11 crc kubenswrapper[4794]: I1215 14:24:11.664369 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.0738646530000002 podStartE2EDuration="6.664351815s" podCreationTimestamp="2025-12-15 14:24:05 +0000 UTC" firstStartedPulling="2025-12-15 14:24:06.612912477 +0000 UTC m=+1808.464934915" lastFinishedPulling="2025-12-15 14:24:11.203399639 +0000 UTC m=+1813.055422077" observedRunningTime="2025-12-15 14:24:11.65817749 +0000 UTC m=+1813.510199928" watchObservedRunningTime="2025-12-15 14:24:11.664351815 +0000 UTC m=+1813.516374253" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.084377 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.339516 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.378715 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-wl49x"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.385480 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-wl49x"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.429516 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.429903 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="probe" containerID="cri-o://717081ba19ae34c169ea744a10b4b1b8d7b8db704b768e97238390df639b19d3" gracePeriod=30 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.430046 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="cinder-backup" containerID="cri-o://d137d31e7c51b907fe8218663f3e8b3cf1a5402d9e65fd92c1666bd9dc760f2d" gracePeriod=30 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.443375 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.443684 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" 
containerName="cinder-scheduler" containerID="cri-o://d8a2efb48013fcf6f4b7db86c11b367946302f6715a1f5f607699047432aad81" gracePeriod=30 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.443846 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="probe" containerID="cri-o://084d0f42b92f6ca4b6944a7d7a157c3785777c03e634e8cc99c61ad69c96cc19" gracePeriod=30 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.479066 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.479295 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api-log" containerID="cri-o://11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705" gracePeriod=30 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.479658 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api" containerID="cri-o://eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91" gracePeriod=30 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.511642 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder2114-account-delete-6l8nn"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.513413 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.526935 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder2114-account-delete-6l8nn"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.552728 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-create-kkqg6"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.571838 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-create-kkqg6"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.621319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lr6j\" (UniqueName: \"kubernetes.io/projected/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c-kube-api-access-4lr6j\") pod \"cinder2114-account-delete-6l8nn\" (UID: \"d5bc3d83-5920-4ab7-8800-e113d5e7ca6c\") " pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.635876 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-2114-account-create-9w88k"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.646272 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-2114-account-create-9w88k"] Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.653753 4794 generic.go:334] "Generic (PLEG): container finished" podID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerID="11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705" exitCode=143 Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.654868 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"113c107d-bfbc-43bd-bddb-401c41f0ac1b","Type":"ContainerDied","Data":"11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705"} Dec 15 14:24:12 crc 
kubenswrapper[4794]: I1215 14:24:12.654911 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.659329 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder2114-account-delete-6l8nn"] Dec 15 14:24:12 crc kubenswrapper[4794]: E1215 14:24:12.659734 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4lr6j], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" podUID="d5bc3d83-5920-4ab7-8800-e113d5e7ca6c" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.723429 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lr6j\" (UniqueName: \"kubernetes.io/projected/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c-kube-api-access-4lr6j\") pod \"cinder2114-account-delete-6l8nn\" (UID: \"d5bc3d83-5920-4ab7-8800-e113d5e7ca6c\") " pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.743126 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lr6j\" (UniqueName: \"kubernetes.io/projected/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c-kube-api-access-4lr6j\") pod \"cinder2114-account-delete-6l8nn\" (UID: \"d5bc3d83-5920-4ab7-8800-e113d5e7ca6c\") " pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.746628 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5551032f-6488-45e4-a75e-6895f05c408b" path="/var/lib/kubelet/pods/5551032f-6488-45e4-a75e-6895f05c408b/volumes" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.747201 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a0caa7-1566-4c3b-8b9e-4ebc50494940" 
path="/var/lib/kubelet/pods/64a0caa7-1566-4c3b-8b9e-4ebc50494940/volumes" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.747678 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97c36d1-1f8b-4e57-952e-19550c836b71" path="/var/lib/kubelet/pods/c97c36d1-1f8b-4e57-952e-19550c836b71/volumes" Dec 15 14:24:12 crc kubenswrapper[4794]: I1215 14:24:12.748127 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c5bd48-31cd-4fa1-88c0-9afb783929eb" path="/var/lib/kubelet/pods/e6c5bd48-31cd-4fa1-88c0-9afb783929eb/volumes" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.525798 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.670759 4794 generic.go:334] "Generic (PLEG): container finished" podID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerID="717081ba19ae34c169ea744a10b4b1b8d7b8db704b768e97238390df639b19d3" exitCode=0 Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.671055 4794 generic.go:334] "Generic (PLEG): container finished" podID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerID="d137d31e7c51b907fe8218663f3e8b3cf1a5402d9e65fd92c1666bd9dc760f2d" exitCode=0 Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.671105 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"fb0abf6f-ac5a-4793-96ae-02221cbb00bf","Type":"ContainerDied","Data":"717081ba19ae34c169ea744a10b4b1b8d7b8db704b768e97238390df639b19d3"} Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.671136 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"fb0abf6f-ac5a-4793-96ae-02221cbb00bf","Type":"ContainerDied","Data":"d137d31e7c51b907fe8218663f3e8b3cf1a5402d9e65fd92c1666bd9dc760f2d"} Dec 15 14:24:13 crc kubenswrapper[4794]: 
I1215 14:24:13.673715 4794 generic.go:334] "Generic (PLEG): container finished" podID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerID="084d0f42b92f6ca4b6944a7d7a157c3785777c03e634e8cc99c61ad69c96cc19" exitCode=0 Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.673741 4794 generic.go:334] "Generic (PLEG): container finished" podID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerID="d8a2efb48013fcf6f4b7db86c11b367946302f6715a1f5f607699047432aad81" exitCode=0 Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.674672 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"029ea379-16dc-4fbe-9681-fbeb63bcc952","Type":"ContainerDied","Data":"084d0f42b92f6ca4b6944a7d7a157c3785777c03e634e8cc99c61ad69c96cc19"} Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.674708 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"029ea379-16dc-4fbe-9681-fbeb63bcc952","Type":"ContainerDied","Data":"d8a2efb48013fcf6f4b7db86c11b367946302f6715a1f5f607699047432aad81"} Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.674741 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.684712 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.813932 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.846221 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lr6j\" (UniqueName: \"kubernetes.io/projected/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c-kube-api-access-4lr6j\") pod \"d5bc3d83-5920-4ab7-8800-e113d5e7ca6c\" (UID: \"d5bc3d83-5920-4ab7-8800-e113d5e7ca6c\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.852772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c-kube-api-access-4lr6j" (OuterVolumeSpecName: "kube-api-access-4lr6j") pod "d5bc3d83-5920-4ab7-8800-e113d5e7ca6c" (UID: "d5bc3d83-5920-4ab7-8800-e113d5e7ca6c"). InnerVolumeSpecName "kube-api-access-4lr6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947173 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947243 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data-custom\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947259 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-brick\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc 
kubenswrapper[4794]: I1215 14:24:13.947311 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-lib-cinder\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947339 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-combined-ca-bundle\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947362 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-sys\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947388 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-run\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947423 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-lib-modules\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947438 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-cinder\") pod 
\"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947463 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-nvme\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947490 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-iscsi\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947538 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-scripts\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947570 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-machine-id\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947604 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wfm4\" (UniqueName: \"kubernetes.io/projected/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-kube-api-access-4wfm4\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947621 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-cert-memcached-mtls\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947681 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-dev\") pod \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\" (UID: \"fb0abf6f-ac5a-4793-96ae-02221cbb00bf\") " Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.947919 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948007 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948050 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-sys" (OuterVolumeSpecName: "sys") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948060 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948079 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948084 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948027 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-run" (OuterVolumeSpecName: "run") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948103 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948405 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948425 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lr6j\" (UniqueName: \"kubernetes.io/projected/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c-kube-api-access-4lr6j\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948442 4794 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948466 4794 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-sys\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948480 4794 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-run\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948491 4794 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948502 4794 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948513 4794 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948526 4794 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948440 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-dev" (OuterVolumeSpecName: "dev") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.948442 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.951204 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-scripts" (OuterVolumeSpecName: "scripts") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.952043 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-kube-api-access-4wfm4" (OuterVolumeSpecName: "kube-api-access-4wfm4") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "kube-api-access-4wfm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.953727 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.961716 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:24:13 crc kubenswrapper[4794]: I1215 14:24:13.998693 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051011 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/029ea379-16dc-4fbe-9681-fbeb63bcc952-etc-machine-id\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051709 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data-custom\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051092 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/029ea379-16dc-4fbe-9681-fbeb63bcc952-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: "029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051761 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-cert-memcached-mtls\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051784 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s566t\" (UniqueName: \"kubernetes.io/projected/029ea379-16dc-4fbe-9681-fbeb63bcc952-kube-api-access-s566t\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051863 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-scripts\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051887 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-combined-ca-bundle\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.051927 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data\") pod \"029ea379-16dc-4fbe-9681-fbeb63bcc952\" (UID: \"029ea379-16dc-4fbe-9681-fbeb63bcc952\") " Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.052997 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.053025 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.053038 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wfm4\" (UniqueName: \"kubernetes.io/projected/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-kube-api-access-4wfm4\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.053052 4794 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-dev\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.053062 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/029ea379-16dc-4fbe-9681-fbeb63bcc952-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.053073 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.053086 4794 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.055144 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-scripts" (OuterVolumeSpecName: "scripts") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: 
"029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.057288 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: "029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.057345 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029ea379-16dc-4fbe-9681-fbeb63bcc952-kube-api-access-s566t" (OuterVolumeSpecName: "kube-api-access-s566t") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: "029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "kube-api-access-s566t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.090547 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data" (OuterVolumeSpecName: "config-data") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.114772 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: "029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.127169 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "fb0abf6f-ac5a-4793-96ae-02221cbb00bf" (UID: "fb0abf6f-ac5a-4793-96ae-02221cbb00bf"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.152638 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data" (OuterVolumeSpecName: "config-data") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: "029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154664 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s566t\" (UniqueName: \"kubernetes.io/projected/029ea379-16dc-4fbe-9681-fbeb63bcc952-kube-api-access-s566t\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154689 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154701 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154711 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data\") on node \"crc\" DevicePath \"\"" Dec 
15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154722 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154732 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0abf6f-ac5a-4793-96ae-02221cbb00bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.154743 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.195567 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "029ea379-16dc-4fbe-9681-fbeb63bcc952" (UID: "029ea379-16dc-4fbe-9681-fbeb63bcc952"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.256719 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/029ea379-16dc-4fbe-9681-fbeb63bcc952-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.673982 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.681541 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.681749 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="ac41a509-04ac-4a2a-b365-61599c44f9da" containerName="watcher-decision-engine" containerID="cri-o://8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af" gracePeriod=30 Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.683967 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"fb0abf6f-ac5a-4793-96ae-02221cbb00bf","Type":"ContainerDied","Data":"e10af78f6577d45844503288f0d9431971a10bfed324fc3c8e69cd4e6b6ceb48"} Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.684023 4794 scope.go:117] "RemoveContainer" containerID="717081ba19ae34c169ea744a10b4b1b8d7b8db704b768e97238390df639b19d3" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.684158 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.693927 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder2114-account-delete-6l8nn" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.693950 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"029ea379-16dc-4fbe-9681-fbeb63bcc952","Type":"ContainerDied","Data":"41e0e1ca3d1c4ba7a5f01d9cdb8764248fdea6b7266cbb02b838995dee70e276"} Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.693934 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.719773 4794 scope.go:117] "RemoveContainer" containerID="d137d31e7c51b907fe8218663f3e8b3cf1a5402d9e65fd92c1666bd9dc760f2d" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.764565 4794 scope.go:117] "RemoveContainer" containerID="084d0f42b92f6ca4b6944a7d7a157c3785777c03e634e8cc99c61ad69c96cc19" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.790507 4794 scope.go:117] "RemoveContainer" containerID="d8a2efb48013fcf6f4b7db86c11b367946302f6715a1f5f607699047432aad81" Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.808677 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder2114-account-delete-6l8nn"] Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.813604 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder2114-account-delete-6l8nn"] Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.820383 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.828734 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Dec 15 14:24:14 crc kubenswrapper[4794]: I1215 14:24:14.833179 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:24:14 crc 
kubenswrapper[4794]: I1215 14:24:14.842096 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Dec 15 14:24:15 crc kubenswrapper[4794]: I1215 14:24:15.797149 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:15 crc kubenswrapper[4794]: I1215 14:24:15.797752 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-central-agent" containerID="cri-o://a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" gracePeriod=30 Dec 15 14:24:15 crc kubenswrapper[4794]: I1215 14:24:15.797869 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="proxy-httpd" containerID="cri-o://1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" gracePeriod=30 Dec 15 14:24:15 crc kubenswrapper[4794]: I1215 14:24:15.797904 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-notification-agent" containerID="cri-o://c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" gracePeriod=30 Dec 15 14:24:15 crc kubenswrapper[4794]: I1215 14:24:15.797983 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="sg-core" containerID="cri-o://8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" gracePeriod=30 Dec 15 14:24:15 crc kubenswrapper[4794]: I1215 14:24:15.879073 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.567548 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707170 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-ceilometer-tls-certs\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707543 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-sg-core-conf-yaml\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707646 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5st\" (UniqueName: \"kubernetes.io/projected/017b7b46-6faf-41b9-994b-3f30078d6086-kube-api-access-rv5st\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707672 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-run-httpd\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707729 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-config-data\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707887 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-log-httpd\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707924 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-combined-ca-bundle\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.707966 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-scripts\") pod \"017b7b46-6faf-41b9-994b-3f30078d6086\" (UID: \"017b7b46-6faf-41b9-994b-3f30078d6086\") " Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.708171 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.708476 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.708618 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.712728 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017b7b46-6faf-41b9-994b-3f30078d6086-kube-api-access-rv5st" (OuterVolumeSpecName: "kube-api-access-rv5st") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). InnerVolumeSpecName "kube-api-access-rv5st". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.712856 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-scripts" (OuterVolumeSpecName: "scripts") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734070 4794 generic.go:334] "Generic (PLEG): container finished" podID="017b7b46-6faf-41b9-994b-3f30078d6086" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" exitCode=0 Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734357 4794 generic.go:334] "Generic (PLEG): container finished" podID="017b7b46-6faf-41b9-994b-3f30078d6086" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" exitCode=2 Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734441 4794 generic.go:334] "Generic (PLEG): container finished" podID="017b7b46-6faf-41b9-994b-3f30078d6086" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" exitCode=0 Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734516 4794 generic.go:334] "Generic (PLEG): container finished" podID="017b7b46-6faf-41b9-994b-3f30078d6086" containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" exitCode=0 Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734657 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerDied","Data":"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2"} Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734766 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerDied","Data":"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b"} Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734853 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerDied","Data":"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1"} Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.734934 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerDied","Data":"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb"} Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.735024 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"017b7b46-6faf-41b9-994b-3f30078d6086","Type":"ContainerDied","Data":"d8d9eab8c638cd1211f53f31b5b30368a011284df63a2db704996cabd052ee77"} Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.735112 4794 scope.go:117] "RemoveContainer" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.735312 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.736614 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.749172 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" path="/var/lib/kubelet/pods/029ea379-16dc-4fbe-9681-fbeb63bcc952/volumes" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.750066 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bc3d83-5920-4ab7-8800-e113d5e7ca6c" path="/var/lib/kubelet/pods/d5bc3d83-5920-4ab7-8800-e113d5e7ca6c/volumes" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.750489 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" path="/var/lib/kubelet/pods/fb0abf6f-ac5a-4793-96ae-02221cbb00bf/volumes" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.760493 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.766073 4794 scope.go:117] "RemoveContainer" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.785718 4794 scope.go:117] "RemoveContainer" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.791868 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.809289 4794 scope.go:117] "RemoveContainer" containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.810396 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv5st\" (UniqueName: \"kubernetes.io/projected/017b7b46-6faf-41b9-994b-3f30078d6086-kube-api-access-rv5st\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.810441 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/017b7b46-6faf-41b9-994b-3f30078d6086-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.810458 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.810470 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.810482 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.810495 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.813640 4794 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-config-data" (OuterVolumeSpecName: "config-data") pod "017b7b46-6faf-41b9-994b-3f30078d6086" (UID: "017b7b46-6faf-41b9-994b-3f30078d6086"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.826838 4794 scope.go:117] "RemoveContainer" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" Dec 15 14:24:16 crc kubenswrapper[4794]: E1215 14:24:16.829018 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": container with ID starting with 1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2 not found: ID does not exist" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.829067 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2"} err="failed to get container status \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": rpc error: code = NotFound desc = could not find container \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": container with ID starting with 1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.829102 4794 scope.go:117] "RemoveContainer" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" Dec 15 14:24:16 crc kubenswrapper[4794]: E1215 14:24:16.830000 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": container with ID starting with 8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b not found: ID does not exist" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.830036 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b"} err="failed to get container status \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": rpc error: code = NotFound desc = could not find container \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": container with ID starting with 8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.830057 4794 scope.go:117] "RemoveContainer" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" Dec 15 14:24:16 crc kubenswrapper[4794]: E1215 14:24:16.830413 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": container with ID starting with c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1 not found: ID does not exist" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.830453 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1"} err="failed to get container status \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": rpc error: code = NotFound desc = could not find container \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": container with ID 
starting with c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.830470 4794 scope.go:117] "RemoveContainer" containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" Dec 15 14:24:16 crc kubenswrapper[4794]: E1215 14:24:16.830783 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": container with ID starting with a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb not found: ID does not exist" containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.830811 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb"} err="failed to get container status \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": rpc error: code = NotFound desc = could not find container \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": container with ID starting with a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.830826 4794 scope.go:117] "RemoveContainer" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.833074 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2"} err="failed to get container status \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": rpc error: code = NotFound desc = could not find container \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": 
container with ID starting with 1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.833138 4794 scope.go:117] "RemoveContainer" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.833480 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b"} err="failed to get container status \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": rpc error: code = NotFound desc = could not find container \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": container with ID starting with 8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.833499 4794 scope.go:117] "RemoveContainer" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.834192 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1"} err="failed to get container status \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": rpc error: code = NotFound desc = could not find container \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": container with ID starting with c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.834232 4794 scope.go:117] "RemoveContainer" containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.834532 4794 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb"} err="failed to get container status \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": rpc error: code = NotFound desc = could not find container \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": container with ID starting with a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.834564 4794 scope.go:117] "RemoveContainer" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.834848 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2"} err="failed to get container status \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": rpc error: code = NotFound desc = could not find container \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": container with ID starting with 1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.834870 4794 scope.go:117] "RemoveContainer" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.836138 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b"} err="failed to get container status \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": rpc error: code = NotFound desc = could not find container \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": container with ID starting with 8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b not found: ID does not 
exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.836170 4794 scope.go:117] "RemoveContainer" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.836444 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1"} err="failed to get container status \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": rpc error: code = NotFound desc = could not find container \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": container with ID starting with c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.836482 4794 scope.go:117] "RemoveContainer" containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.837396 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb"} err="failed to get container status \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": rpc error: code = NotFound desc = could not find container \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": container with ID starting with a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.837420 4794 scope.go:117] "RemoveContainer" containerID="1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.837653 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2"} err="failed to get container status 
\"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": rpc error: code = NotFound desc = could not find container \"1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2\": container with ID starting with 1d1702a2d1526c55e300520577250faa15419be2af66b4d7ceabc5e2b933edd2 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.837670 4794 scope.go:117] "RemoveContainer" containerID="8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.837941 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b"} err="failed to get container status \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": rpc error: code = NotFound desc = could not find container \"8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b\": container with ID starting with 8e16585c564ea3a1eb370176146eeb463e2b534bf13c5b8ff8aaa672f0ece44b not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.837968 4794 scope.go:117] "RemoveContainer" containerID="c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.838314 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1"} err="failed to get container status \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": rpc error: code = NotFound desc = could not find container \"c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1\": container with ID starting with c20b9fb2671d0073202e38b00ba2fb8791dbddd6128f11304d8167948561a3e1 not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.838332 4794 scope.go:117] "RemoveContainer" 
containerID="a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.838662 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb"} err="failed to get container status \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": rpc error: code = NotFound desc = could not find container \"a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb\": container with ID starting with a3e6a4782f0d9f2637fe103465217d4122719b2e71c94ba6f9ccc592b48e12eb not found: ID does not exist" Dec 15 14:24:16 crc kubenswrapper[4794]: I1215 14:24:16.911917 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017b7b46-6faf-41b9-994b-3f30078d6086-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.067455 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.091561 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.112990 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118357 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118642 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-central-agent" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118651 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-central-agent" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118666 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="sg-core" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118672 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="sg-core" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118680 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-notification-agent" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118686 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-notification-agent" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118694 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="cinder-backup" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118700 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="cinder-backup" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118713 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="probe" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118718 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="probe" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118729 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="proxy-httpd" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118734 4794 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="proxy-httpd" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118749 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="cinder-scheduler" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118755 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="cinder-scheduler" Dec 15 14:24:17 crc kubenswrapper[4794]: E1215 14:24:17.118771 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="probe" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118777 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="probe" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118921 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="probe" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118930 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-notification-agent" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118945 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="proxy-httpd" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118952 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="sg-core" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118962 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="probe" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118973 4794 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fb0abf6f-ac5a-4793-96ae-02221cbb00bf" containerName="cinder-backup" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118981 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="029ea379-16dc-4fbe-9681-fbeb63bcc952" containerName="cinder-scheduler" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.118993 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" containerName="ceilometer-central-agent" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.120336 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.158239 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.158730 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.158897 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.191430 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216312 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbfk\" (UniqueName: \"kubernetes.io/projected/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-kube-api-access-ffbfk\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216380 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216411 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-config-data\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216594 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-scripts\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216637 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-log-httpd\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216696 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216828 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-run-httpd\") pod \"ceilometer-0\" 
(UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.216936 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318486 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318554 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-config-data\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318619 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-scripts\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318637 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-log-httpd\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318672 
4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318692 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-run-httpd\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318720 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.319179 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-log-httpd\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.319227 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-run-httpd\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.318793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbfk\" (UniqueName: 
\"kubernetes.io/projected/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-kube-api-access-ffbfk\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.322705 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.322744 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.323450 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.323520 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-config-data\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.324232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-scripts\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 
15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.343086 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbfk\" (UniqueName: \"kubernetes.io/projected/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-kube-api-access-ffbfk\") pod \"ceilometer-0\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.475198 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.939503 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:17 crc kubenswrapper[4794]: I1215 14:24:17.942415 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/cinder-api-0" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.198:8776/healthcheck\": read tcp 10.217.0.2:51482->10.217.0.198:8776: read: connection reset by peer" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.260084 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.276007 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ac41a509-04ac-4a2a-b365-61599c44f9da/watcher-decision-engine/0.log" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.347740 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnwlc\" (UniqueName: \"kubernetes.io/projected/ac41a509-04ac-4a2a-b365-61599c44f9da-kube-api-access-cnwlc\") pod \"ac41a509-04ac-4a2a-b365-61599c44f9da\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.347789 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-custom-prometheus-ca\") pod \"ac41a509-04ac-4a2a-b365-61599c44f9da\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.347869 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-config-data\") pod \"ac41a509-04ac-4a2a-b365-61599c44f9da\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.347957 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac41a509-04ac-4a2a-b365-61599c44f9da-logs\") pod \"ac41a509-04ac-4a2a-b365-61599c44f9da\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.347985 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-cert-memcached-mtls\") pod \"ac41a509-04ac-4a2a-b365-61599c44f9da\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.348028 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-combined-ca-bundle\") pod \"ac41a509-04ac-4a2a-b365-61599c44f9da\" (UID: \"ac41a509-04ac-4a2a-b365-61599c44f9da\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.348548 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac41a509-04ac-4a2a-b365-61599c44f9da-logs" (OuterVolumeSpecName: "logs") pod "ac41a509-04ac-4a2a-b365-61599c44f9da" (UID: "ac41a509-04ac-4a2a-b365-61599c44f9da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.355500 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac41a509-04ac-4a2a-b365-61599c44f9da-kube-api-access-cnwlc" (OuterVolumeSpecName: "kube-api-access-cnwlc") pod "ac41a509-04ac-4a2a-b365-61599c44f9da" (UID: "ac41a509-04ac-4a2a-b365-61599c44f9da"). InnerVolumeSpecName "kube-api-access-cnwlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.357511 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.376028 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac41a509-04ac-4a2a-b365-61599c44f9da" (UID: "ac41a509-04ac-4a2a-b365-61599c44f9da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.376512 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ac41a509-04ac-4a2a-b365-61599c44f9da" (UID: "ac41a509-04ac-4a2a-b365-61599c44f9da"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.447951 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-config-data" (OuterVolumeSpecName: "config-data") pod "ac41a509-04ac-4a2a-b365-61599c44f9da" (UID: "ac41a509-04ac-4a2a-b365-61599c44f9da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450106 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-combined-ca-bundle\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450202 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-scripts\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450274 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113c107d-bfbc-43bd-bddb-401c41f0ac1b-logs\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc 
kubenswrapper[4794]: I1215 14:24:18.450293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-internal-tls-certs\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450320 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/113c107d-bfbc-43bd-bddb-401c41f0ac1b-etc-machine-id\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450342 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450390 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-public-tls-certs\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450409 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-cert-memcached-mtls\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450448 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data-custom\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450488 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2m4\" (UniqueName: \"kubernetes.io/projected/113c107d-bfbc-43bd-bddb-401c41f0ac1b-kube-api-access-zc2m4\") pod \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\" (UID: \"113c107d-bfbc-43bd-bddb-401c41f0ac1b\") " Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450878 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac41a509-04ac-4a2a-b365-61599c44f9da-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450896 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450908 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnwlc\" (UniqueName: \"kubernetes.io/projected/ac41a509-04ac-4a2a-b365-61599c44f9da-kube-api-access-cnwlc\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450918 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.450926 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.454652 4794 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/113c107d-bfbc-43bd-bddb-401c41f0ac1b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.455016 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/113c107d-bfbc-43bd-bddb-401c41f0ac1b-logs" (OuterVolumeSpecName: "logs") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.455495 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113c107d-bfbc-43bd-bddb-401c41f0ac1b-kube-api-access-zc2m4" (OuterVolumeSpecName: "kube-api-access-zc2m4") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "kube-api-access-zc2m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.457761 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-scripts" (OuterVolumeSpecName: "scripts") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.461272 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.482197 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ac41a509-04ac-4a2a-b365-61599c44f9da" (UID: "ac41a509-04ac-4a2a-b365-61599c44f9da"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.523711 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data" (OuterVolumeSpecName: "config-data") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552071 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ac41a509-04ac-4a2a-b365-61599c44f9da-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552103 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552114 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113c107d-bfbc-43bd-bddb-401c41f0ac1b-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552183 4794 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/113c107d-bfbc-43bd-bddb-401c41f0ac1b-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552213 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552223 4794 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.552260 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2m4\" (UniqueName: \"kubernetes.io/projected/113c107d-bfbc-43bd-bddb-401c41f0ac1b-kube-api-access-zc2m4\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.554684 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.558700 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.565733 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.599750 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "113c107d-bfbc-43bd-bddb-401c41f0ac1b" (UID: "113c107d-bfbc-43bd-bddb-401c41f0ac1b"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.654088 4794 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.654558 4794 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.654657 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.654714 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/113c107d-bfbc-43bd-bddb-401c41f0ac1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.742988 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.743206 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.748273 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017b7b46-6faf-41b9-994b-3f30078d6086" path="/var/lib/kubelet/pods/017b7b46-6faf-41b9-994b-3f30078d6086/volumes" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.768332 4794 generic.go:334] "Generic (PLEG): container finished" podID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerID="eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91" exitCode=0 Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.768402 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"113c107d-bfbc-43bd-bddb-401c41f0ac1b","Type":"ContainerDied","Data":"eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91"} Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.768432 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"113c107d-bfbc-43bd-bddb-401c41f0ac1b","Type":"ContainerDied","Data":"a1a713c7dc815e72936f3152fb380272c9f8566167278d5867f3b64650a4e4f0"} Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.768438 4794 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.768448 4794 scope.go:117] "RemoveContainer" containerID="eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.778399 4794 generic.go:334] "Generic (PLEG): container finished" podID="ac41a509-04ac-4a2a-b365-61599c44f9da" containerID="8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af" exitCode=0 Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.778477 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ac41a509-04ac-4a2a-b365-61599c44f9da","Type":"ContainerDied","Data":"8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af"} Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.778498 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ac41a509-04ac-4a2a-b365-61599c44f9da","Type":"ContainerDied","Data":"28c240872d9ecbb8af50de240ba1a2b08bcaf823d7ed87616331d5feb692fb54"} Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.778641 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.792052 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerStarted","Data":"8b81188a6c3372876f7095cce3c9d0c996587e2f85f32f3c14a2e88abed494c4"} Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.844944 4794 scope.go:117] "RemoveContainer" containerID="11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.854204 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.864644 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.875940 4794 scope.go:117] "RemoveContainer" containerID="eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91" Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.876427 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91\": container with ID starting with eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91 not found: ID does not exist" containerID="eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.876455 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91"} err="failed to get container status \"eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91\": rpc error: code = NotFound desc = could not find container 
\"eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91\": container with ID starting with eff477a256bc0cabd8db467c8d6e3e1642a402d0c28537448aaddb3ede741e91 not found: ID does not exist" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.876478 4794 scope.go:117] "RemoveContainer" containerID="11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705" Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.877251 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705\": container with ID starting with 11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705 not found: ID does not exist" containerID="11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.877397 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705"} err="failed to get container status \"11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705\": rpc error: code = NotFound desc = could not find container \"11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705\": container with ID starting with 11ae7b9eb885118aea6f5f70dcd4496178a0d4d12eaf829f4e0f2175b4350705 not found: ID does not exist" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.877429 4794 scope.go:117] "RemoveContainer" containerID="8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.880769 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.889901 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:18 crc 
kubenswrapper[4794]: I1215 14:24:18.896415 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.897092 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.897144 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api" Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.897172 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api-log" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.897183 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api-log" Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.897202 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac41a509-04ac-4a2a-b365-61599c44f9da" containerName="watcher-decision-engine" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.897209 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac41a509-04ac-4a2a-b365-61599c44f9da" containerName="watcher-decision-engine" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.897396 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api-log" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.897417 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" containerName="cinder-api" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.897426 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac41a509-04ac-4a2a-b365-61599c44f9da" containerName="watcher-decision-engine" Dec 15 14:24:18 crc 
kubenswrapper[4794]: I1215 14:24:18.898329 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.901840 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.905335 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.919951 4794 scope.go:117] "RemoveContainer" containerID="8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af" Dec 15 14:24:18 crc kubenswrapper[4794]: E1215 14:24:18.920468 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af\": container with ID starting with 8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af not found: ID does not exist" containerID="8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af" Dec 15 14:24:18 crc kubenswrapper[4794]: I1215 14:24:18.920525 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af"} err="failed to get container status \"8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af\": rpc error: code = NotFound desc = could not find container \"8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af\": container with ID starting with 8ac4d3304759eb1a1de1c8b0be373874bf8bcfac58d3d46693184314b7d015af not found: ID does not exist" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.061471 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nggxz\" (UniqueName: 
\"kubernetes.io/projected/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-kube-api-access-nggxz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.061546 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.061644 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.061670 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.061698 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.061721 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.162685 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nggxz\" (UniqueName: \"kubernetes.io/projected/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-kube-api-access-nggxz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.162750 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.162804 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.162825 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 
14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.162850 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.162874 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.163283 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.167417 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.167443 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.167500 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.179497 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.186034 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggxz\" (UniqueName: \"kubernetes.io/projected/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-kube-api-access-nggxz\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.319953 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.772935 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.805748 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerStarted","Data":"96f0d2d088337a2d02cb54a747fccc0ed7b200211a4bc63186853ec7e93f0fe7"} Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.805968 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerStarted","Data":"bdfb1b3d0dc1223653a0d0e7df4e99efab19dcbc297ce932bd6833ebfec4ac3f"} Dec 15 14:24:19 crc kubenswrapper[4794]: I1215 14:24:19.807195 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f0acbc62-7836-4ea9-a0e4-7c58d7b41896","Type":"ContainerStarted","Data":"b5179ad6769cfb7e50f0310a33911cd37bf59fec557edaef4e1b90f19268f4dd"} Dec 15 14:24:20 crc kubenswrapper[4794]: I1215 14:24:20.770626 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113c107d-bfbc-43bd-bddb-401c41f0ac1b" path="/var/lib/kubelet/pods/113c107d-bfbc-43bd-bddb-401c41f0ac1b/volumes" Dec 15 14:24:20 crc kubenswrapper[4794]: I1215 14:24:20.771561 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac41a509-04ac-4a2a-b365-61599c44f9da" path="/var/lib/kubelet/pods/ac41a509-04ac-4a2a-b365-61599c44f9da/volumes" Dec 15 14:24:20 crc kubenswrapper[4794]: I1215 14:24:20.817703 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerStarted","Data":"412604e407beaefae728966a59698b34b69e7ba443c1d1d6bc5d94491769ba1f"} Dec 15 14:24:20 crc kubenswrapper[4794]: I1215 14:24:20.819538 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f0acbc62-7836-4ea9-a0e4-7c58d7b41896","Type":"ContainerStarted","Data":"2320096567ba32a7e21ce8c838966d3e0229ba9e685f2be581c50c7824663282"} Dec 15 14:24:20 crc kubenswrapper[4794]: I1215 14:24:20.845506 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.845485064 podStartE2EDuration="2.845485064s" podCreationTimestamp="2025-12-15 14:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:24:20.838554247 +0000 UTC m=+1822.690576675" watchObservedRunningTime="2025-12-15 14:24:20.845485064 +0000 UTC m=+1822.697507502" Dec 15 14:24:21 crc kubenswrapper[4794]: I1215 14:24:21.043347 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-7b39-account-create-l4qsh"] Dec 15 14:24:21 crc kubenswrapper[4794]: I1215 14:24:21.054895 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-7b39-account-create-l4qsh"] Dec 15 14:24:21 crc kubenswrapper[4794]: I1215 14:24:21.749407 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:21 crc kubenswrapper[4794]: I1215 14:24:21.829279 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerStarted","Data":"e02f1bc71b12e0a0f7e2689f6a19695a7e1ad44e9b58769c45acfbf78202e40e"} Dec 15 
14:24:21 crc kubenswrapper[4794]: I1215 14:24:21.854227 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.269703418 podStartE2EDuration="4.854206591s" podCreationTimestamp="2025-12-15 14:24:17 +0000 UTC" firstStartedPulling="2025-12-15 14:24:17.952714347 +0000 UTC m=+1819.804736785" lastFinishedPulling="2025-12-15 14:24:21.53721752 +0000 UTC m=+1823.389239958" observedRunningTime="2025-12-15 14:24:21.847341787 +0000 UTC m=+1823.699364255" watchObservedRunningTime="2025-12-15 14:24:21.854206591 +0000 UTC m=+1823.706229029" Dec 15 14:24:22 crc kubenswrapper[4794]: I1215 14:24:22.748847 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2dc9c2-0d41-4fa9-9104-7fe7854a0989" path="/var/lib/kubelet/pods/dd2dc9c2-0d41-4fa9-9104-7fe7854a0989/volumes" Dec 15 14:24:22 crc kubenswrapper[4794]: I1215 14:24:22.837942 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:22 crc kubenswrapper[4794]: I1215 14:24:22.954263 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:24 crc kubenswrapper[4794]: I1215 14:24:24.151827 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:25 crc kubenswrapper[4794]: I1215 14:24:25.347521 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:26 crc kubenswrapper[4794]: I1215 14:24:26.578891 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:27 crc kubenswrapper[4794]: I1215 14:24:27.797514 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:28 crc kubenswrapper[4794]: I1215 14:24:28.991419 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:29 crc kubenswrapper[4794]: I1215 14:24:29.320779 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:29 crc kubenswrapper[4794]: I1215 14:24:29.347053 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:29 crc kubenswrapper[4794]: I1215 14:24:29.910423 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:29 crc kubenswrapper[4794]: I1215 14:24:29.948282 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:30 crc kubenswrapper[4794]: I1215 14:24:30.160258 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.345330 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_f0acbc62-7836-4ea9-a0e4-7c58d7b41896/watcher-decision-engine/0.log" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.466646 4794 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-42r2l"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.476216 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-42r2l"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.523894 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherb809-account-delete-g2d6r"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.524922 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.554787 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.555079 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="04bdfdbc-2a4d-4081-9230-c0307b4cca8c" containerName="watcher-applier" containerID="cri-o://fc1915a2c9d64249d0403fc3d3360dfd376267238a8342e02c2e8c7e1beaf88e" gracePeriod=30 Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.577365 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherb809-account-delete-g2d6r"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.597718 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-b809-account-create-kqj5f"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.603037 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherb809-account-delete-g2d6r"] Dec 15 14:24:31 crc kubenswrapper[4794]: E1215 14:24:31.603630 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7jtj5], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" podUID="ee666d91-c61a-428f-a2ad-84ee07e0ee60" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.631027 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-b809-account-create-kqj5f"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.638965 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sg2tp"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.646485 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-sg2tp"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.660912 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.661234 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-kuttl-api-log" containerID="cri-o://166a2f319423ec5d991e29434c83f2b1a9286046f8bec2da27ccee4e674d4c8e" gracePeriod=30 Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.661711 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-api" containerID="cri-o://59443a897f6786ee1d6df85c2f05a38fca1b98c88d7bc6f4ad5ee4da01581962" gracePeriod=30 Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.667495 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.678984 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtj5\" (UniqueName: \"kubernetes.io/projected/ee666d91-c61a-428f-a2ad-84ee07e0ee60-kube-api-access-7jtj5\") 
pod \"watcherb809-account-delete-g2d6r\" (UID: \"ee666d91-c61a-428f-a2ad-84ee07e0ee60\") " pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.737361 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:24:31 crc kubenswrapper[4794]: E1215 14:24:31.737599 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.780375 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtj5\" (UniqueName: \"kubernetes.io/projected/ee666d91-c61a-428f-a2ad-84ee07e0ee60-kube-api-access-7jtj5\") pod \"watcherb809-account-delete-g2d6r\" (UID: \"ee666d91-c61a-428f-a2ad-84ee07e0ee60\") " pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.812151 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtj5\" (UniqueName: \"kubernetes.io/projected/ee666d91-c61a-428f-a2ad-84ee07e0ee60-kube-api-access-7jtj5\") pod \"watcherb809-account-delete-g2d6r\" (UID: \"ee666d91-c61a-428f-a2ad-84ee07e0ee60\") " pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.927173 4794 generic.go:334] "Generic (PLEG): container finished" podID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerID="166a2f319423ec5d991e29434c83f2b1a9286046f8bec2da27ccee4e674d4c8e" exitCode=143 Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.927256 4794 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.927290 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e9eb4ad7-832e-40ab-a921-f27b230508d6","Type":"ContainerDied","Data":"166a2f319423ec5d991e29434c83f2b1a9286046f8bec2da27ccee4e674d4c8e"} Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.927995 4794 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-89trk\" not found" Dec 15 14:24:31 crc kubenswrapper[4794]: I1215 14:24:31.937757 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.085158 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jtj5\" (UniqueName: \"kubernetes.io/projected/ee666d91-c61a-428f-a2ad-84ee07e0ee60-kube-api-access-7jtj5\") pod \"ee666d91-c61a-428f-a2ad-84ee07e0ee60\" (UID: \"ee666d91-c61a-428f-a2ad-84ee07e0ee60\") " Dec 15 14:24:32 crc kubenswrapper[4794]: E1215 14:24:32.085954 4794 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:32 crc kubenswrapper[4794]: E1215 14:24:32.086051 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data podName:f0acbc62-7836-4ea9-a0e4-7c58d7b41896 nodeName:}" failed. No retries permitted until 2025-12-15 14:24:32.586029835 +0000 UTC m=+1834.438052353 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.088315 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee666d91-c61a-428f-a2ad-84ee07e0ee60-kube-api-access-7jtj5" (OuterVolumeSpecName: "kube-api-access-7jtj5") pod "ee666d91-c61a-428f-a2ad-84ee07e0ee60" (UID: "ee666d91-c61a-428f-a2ad-84ee07e0ee60"). InnerVolumeSpecName "kube-api-access-7jtj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.187387 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jtj5\" (UniqueName: \"kubernetes.io/projected/ee666d91-c61a-428f-a2ad-84ee07e0ee60-kube-api-access-7jtj5\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:32 crc kubenswrapper[4794]: E1215 14:24:32.594099 4794 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:32 crc kubenswrapper[4794]: E1215 14:24:32.594167 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data podName:f0acbc62-7836-4ea9-a0e4-7c58d7b41896 nodeName:}" failed. No retries permitted until 2025-12-15 14:24:33.594152276 +0000 UTC m=+1835.446174714 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.745791 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08415ca6-d8d9-479b-be5e-b2ff38e381d4" path="/var/lib/kubelet/pods/08415ca6-d8d9-479b-be5e-b2ff38e381d4/volumes" Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.746267 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14db7f83-7c85-4926-a966-bad5401ac7c9" path="/var/lib/kubelet/pods/14db7f83-7c85-4926-a966-bad5401ac7c9/volumes" Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.746768 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51a90b6-87a0-4521-8cd2-0689e1c35405" path="/var/lib/kubelet/pods/d51a90b6-87a0-4521-8cd2-0689e1c35405/volumes" Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.935538 4794 generic.go:334] "Generic (PLEG): container finished" podID="04bdfdbc-2a4d-4081-9230-c0307b4cca8c" containerID="fc1915a2c9d64249d0403fc3d3360dfd376267238a8342e02c2e8c7e1beaf88e" exitCode=0 Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.935672 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"04bdfdbc-2a4d-4081-9230-c0307b4cca8c","Type":"ContainerDied","Data":"fc1915a2c9d64249d0403fc3d3360dfd376267238a8342e02c2e8c7e1beaf88e"} Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.938068 4794 generic.go:334] "Generic (PLEG): container finished" podID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerID="59443a897f6786ee1d6df85c2f05a38fca1b98c88d7bc6f4ad5ee4da01581962" exitCode=0 Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.938151 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e9eb4ad7-832e-40ab-a921-f27b230508d6","Type":"ContainerDied","Data":"59443a897f6786ee1d6df85c2f05a38fca1b98c88d7bc6f4ad5ee4da01581962"} Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.938268 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f0acbc62-7836-4ea9-a0e4-7c58d7b41896" containerName="watcher-decision-engine" containerID="cri-o://2320096567ba32a7e21ce8c838966d3e0229ba9e685f2be581c50c7824663282" gracePeriod=30 Dec 15 14:24:32 crc kubenswrapper[4794]: I1215 14:24:32.938285 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherb809-account-delete-g2d6r" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.019490 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherb809-account-delete-g2d6r"] Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.026108 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherb809-account-delete-g2d6r"] Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.280880 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.413970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-cert-memcached-mtls\") pod \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.414398 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-logs\") pod \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.414712 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-combined-ca-bundle\") pod \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.414773 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhfdv\" (UniqueName: \"kubernetes.io/projected/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-kube-api-access-mhfdv\") pod \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.414823 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-config-data\") pod \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\" (UID: \"04bdfdbc-2a4d-4081-9230-c0307b4cca8c\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.415204 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-logs" (OuterVolumeSpecName: "logs") pod "04bdfdbc-2a4d-4081-9230-c0307b4cca8c" (UID: "04bdfdbc-2a4d-4081-9230-c0307b4cca8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.415358 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.419871 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-kube-api-access-mhfdv" (OuterVolumeSpecName: "kube-api-access-mhfdv") pod "04bdfdbc-2a4d-4081-9230-c0307b4cca8c" (UID: "04bdfdbc-2a4d-4081-9230-c0307b4cca8c"). InnerVolumeSpecName "kube-api-access-mhfdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.441136 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04bdfdbc-2a4d-4081-9230-c0307b4cca8c" (UID: "04bdfdbc-2a4d-4081-9230-c0307b4cca8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.468790 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-config-data" (OuterVolumeSpecName: "config-data") pod "04bdfdbc-2a4d-4081-9230-c0307b4cca8c" (UID: "04bdfdbc-2a4d-4081-9230-c0307b4cca8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.489804 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "04bdfdbc-2a4d-4081-9230-c0307b4cca8c" (UID: "04bdfdbc-2a4d-4081-9230-c0307b4cca8c"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.516599 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.516634 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.516647 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhfdv\" (UniqueName: \"kubernetes.io/projected/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-kube-api-access-mhfdv\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.516660 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bdfdbc-2a4d-4081-9230-c0307b4cca8c-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.520533 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.617932 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-cert-memcached-mtls\") pod \"e9eb4ad7-832e-40ab-a921-f27b230508d6\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.617979 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-combined-ca-bundle\") pod \"e9eb4ad7-832e-40ab-a921-f27b230508d6\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.618012 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcvsp\" (UniqueName: \"kubernetes.io/projected/e9eb4ad7-832e-40ab-a921-f27b230508d6-kube-api-access-jcvsp\") pod \"e9eb4ad7-832e-40ab-a921-f27b230508d6\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.618089 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-config-data\") pod \"e9eb4ad7-832e-40ab-a921-f27b230508d6\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.618112 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-custom-prometheus-ca\") pod \"e9eb4ad7-832e-40ab-a921-f27b230508d6\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.618179 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9eb4ad7-832e-40ab-a921-f27b230508d6-logs\") pod \"e9eb4ad7-832e-40ab-a921-f27b230508d6\" (UID: \"e9eb4ad7-832e-40ab-a921-f27b230508d6\") " Dec 15 14:24:33 crc kubenswrapper[4794]: E1215 14:24:33.618700 4794 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:33 crc kubenswrapper[4794]: E1215 14:24:33.618746 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data podName:f0acbc62-7836-4ea9-a0e4-7c58d7b41896 nodeName:}" failed. No retries permitted until 2025-12-15 14:24:35.618730612 +0000 UTC m=+1837.470753050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.622286 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9eb4ad7-832e-40ab-a921-f27b230508d6-logs" (OuterVolumeSpecName: "logs") pod "e9eb4ad7-832e-40ab-a921-f27b230508d6" (UID: "e9eb4ad7-832e-40ab-a921-f27b230508d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.626799 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9eb4ad7-832e-40ab-a921-f27b230508d6-kube-api-access-jcvsp" (OuterVolumeSpecName: "kube-api-access-jcvsp") pod "e9eb4ad7-832e-40ab-a921-f27b230508d6" (UID: "e9eb4ad7-832e-40ab-a921-f27b230508d6"). InnerVolumeSpecName "kube-api-access-jcvsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.647069 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e9eb4ad7-832e-40ab-a921-f27b230508d6" (UID: "e9eb4ad7-832e-40ab-a921-f27b230508d6"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.652521 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9eb4ad7-832e-40ab-a921-f27b230508d6" (UID: "e9eb4ad7-832e-40ab-a921-f27b230508d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.668686 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-config-data" (OuterVolumeSpecName: "config-data") pod "e9eb4ad7-832e-40ab-a921-f27b230508d6" (UID: "e9eb4ad7-832e-40ab-a921-f27b230508d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.683386 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e9eb4ad7-832e-40ab-a921-f27b230508d6" (UID: "e9eb4ad7-832e-40ab-a921-f27b230508d6"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.719749 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9eb4ad7-832e-40ab-a921-f27b230508d6-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.719781 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.719792 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.719825 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcvsp\" (UniqueName: \"kubernetes.io/projected/e9eb4ad7-832e-40ab-a921-f27b230508d6-kube-api-access-jcvsp\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.719834 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.719843 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e9eb4ad7-832e-40ab-a921-f27b230508d6-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.948944 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e9eb4ad7-832e-40ab-a921-f27b230508d6","Type":"ContainerDied","Data":"56ccc6fa6937a1596586de9120b064a682cc48cc5672cf52b03e13b5cd61c3af"} Dec 15 14:24:33 
crc kubenswrapper[4794]: I1215 14:24:33.948980 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.949000 4794 scope.go:117] "RemoveContainer" containerID="59443a897f6786ee1d6df85c2f05a38fca1b98c88d7bc6f4ad5ee4da01581962" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.951169 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"04bdfdbc-2a4d-4081-9230-c0307b4cca8c","Type":"ContainerDied","Data":"27ed47058025b5ffce7a60f8657a938815f1f0bdb1920079e3e71f7da0034a78"} Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.951240 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.969181 4794 scope.go:117] "RemoveContainer" containerID="166a2f319423ec5d991e29434c83f2b1a9286046f8bec2da27ccee4e674d4c8e" Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.988244 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:24:33 crc kubenswrapper[4794]: I1215 14:24:33.997928 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.005198 4794 scope.go:117] "RemoveContainer" containerID="fc1915a2c9d64249d0403fc3d3360dfd376267238a8342e02c2e8c7e1beaf88e" Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.005852 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.012533 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.174207 4794 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.174493 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-central-agent" containerID="cri-o://bdfb1b3d0dc1223653a0d0e7df4e99efab19dcbc297ce932bd6833ebfec4ac3f" gracePeriod=30 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.174637 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="proxy-httpd" containerID="cri-o://e02f1bc71b12e0a0f7e2689f6a19695a7e1ad44e9b58769c45acfbf78202e40e" gracePeriod=30 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.174678 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="sg-core" containerID="cri-o://412604e407beaefae728966a59698b34b69e7ba443c1d1d6bc5d94491769ba1f" gracePeriod=30 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.174709 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-notification-agent" containerID="cri-o://96f0d2d088337a2d02cb54a747fccc0ed7b200211a4bc63186853ec7e93f0fe7" gracePeriod=30 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.189698 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.748110 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="04bdfdbc-2a4d-4081-9230-c0307b4cca8c" path="/var/lib/kubelet/pods/04bdfdbc-2a4d-4081-9230-c0307b4cca8c/volumes" Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.748771 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" path="/var/lib/kubelet/pods/e9eb4ad7-832e-40ab-a921-f27b230508d6/volumes" Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.749353 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee666d91-c61a-428f-a2ad-84ee07e0ee60" path="/var/lib/kubelet/pods/ee666d91-c61a-428f-a2ad-84ee07e0ee60/volumes" Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.962868 4794 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerID="e02f1bc71b12e0a0f7e2689f6a19695a7e1ad44e9b58769c45acfbf78202e40e" exitCode=0 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.962905 4794 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerID="412604e407beaefae728966a59698b34b69e7ba443c1d1d6bc5d94491769ba1f" exitCode=2 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.962917 4794 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerID="bdfb1b3d0dc1223653a0d0e7df4e99efab19dcbc297ce932bd6833ebfec4ac3f" exitCode=0 Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.962936 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerDied","Data":"e02f1bc71b12e0a0f7e2689f6a19695a7e1ad44e9b58769c45acfbf78202e40e"} Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.962967 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerDied","Data":"412604e407beaefae728966a59698b34b69e7ba443c1d1d6bc5d94491769ba1f"} Dec 15 14:24:34 crc kubenswrapper[4794]: I1215 14:24:34.962979 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerDied","Data":"bdfb1b3d0dc1223653a0d0e7df4e99efab19dcbc297ce932bd6833ebfec4ac3f"} Dec 15 14:24:35 crc kubenswrapper[4794]: E1215 14:24:35.651114 4794 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:35 crc kubenswrapper[4794]: E1215 14:24:35.651195 4794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data podName:f0acbc62-7836-4ea9-a0e4-7c58d7b41896 nodeName:}" failed. No retries permitted until 2025-12-15 14:24:39.651176251 +0000 UTC m=+1841.503198679 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896") : secret "watcher-kuttl-decision-engine-config-data" not found Dec 15 14:24:35 crc kubenswrapper[4794]: I1215 14:24:35.973896 4794 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerID="96f0d2d088337a2d02cb54a747fccc0ed7b200211a4bc63186853ec7e93f0fe7" exitCode=0 Dec 15 14:24:35 crc kubenswrapper[4794]: I1215 14:24:35.973938 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerDied","Data":"96f0d2d088337a2d02cb54a747fccc0ed7b200211a4bc63186853ec7e93f0fe7"} Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.487802 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666266 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffbfk\" (UniqueName: \"kubernetes.io/projected/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-kube-api-access-ffbfk\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666356 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-scripts\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666401 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-combined-ca-bundle\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-sg-core-conf-yaml\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666461 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-config-data\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666493 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-ceilometer-tls-certs\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666561 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-run-httpd\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.666695 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-log-httpd\") pod \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\" (UID: \"b3d4ec48-89a3-456c-824e-dc627d6f7b8b\") " Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.667377 4794 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.672059 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.675607 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-scripts" (OuterVolumeSpecName: "scripts") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.677126 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-kube-api-access-ffbfk" (OuterVolumeSpecName: "kube-api-access-ffbfk") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "kube-api-access-ffbfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.698796 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.710306 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.751308 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.756547 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-config-data" (OuterVolumeSpecName: "config-data") pod "b3d4ec48-89a3-456c-824e-dc627d6f7b8b" (UID: "b3d4ec48-89a3-456c-824e-dc627d6f7b8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.768941 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.768970 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffbfk\" (UniqueName: \"kubernetes.io/projected/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-kube-api-access-ffbfk\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.768981 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.768990 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.769001 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.769010 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.769019 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.769028 4794 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ec48-89a3-456c-824e-dc627d6f7b8b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.984726 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"b3d4ec48-89a3-456c-824e-dc627d6f7b8b","Type":"ContainerDied","Data":"8b81188a6c3372876f7095cce3c9d0c996587e2f85f32f3c14a2e88abed494c4"} Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.984771 4794 scope.go:117] "RemoveContainer" containerID="e02f1bc71b12e0a0f7e2689f6a19695a7e1ad44e9b58769c45acfbf78202e40e" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.984869 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.987975 4794 generic.go:334] "Generic (PLEG): container finished" podID="f0acbc62-7836-4ea9-a0e4-7c58d7b41896" containerID="2320096567ba32a7e21ce8c838966d3e0229ba9e685f2be581c50c7824663282" exitCode=0 Dec 15 14:24:36 crc kubenswrapper[4794]: I1215 14:24:36.988013 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f0acbc62-7836-4ea9-a0e4-7c58d7b41896","Type":"ContainerDied","Data":"2320096567ba32a7e21ce8c838966d3e0229ba9e685f2be581c50c7824663282"} Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.020813 4794 scope.go:117] "RemoveContainer" containerID="412604e407beaefae728966a59698b34b69e7ba443c1d1d6bc5d94491769ba1f" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.064486 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.072428 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 
14:24:37.080011 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080401 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-central-agent" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080418 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-central-agent" Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080429 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-api" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080437 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-api" Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080456 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-notification-agent" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080463 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-notification-agent" Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080479 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="sg-core" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080486 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="sg-core" Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080497 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="proxy-httpd" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080505 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="proxy-httpd" Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080513 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bdfdbc-2a4d-4081-9230-c0307b4cca8c" containerName="watcher-applier" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080520 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bdfdbc-2a4d-4081-9230-c0307b4cca8c" containerName="watcher-applier" Dec 15 14:24:37 crc kubenswrapper[4794]: E1215 14:24:37.080544 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-kuttl-api-log" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080552 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-kuttl-api-log" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080811 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-notification-agent" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080825 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="sg-core" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080837 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="04bdfdbc-2a4d-4081-9230-c0307b4cca8c" containerName="watcher-applier" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080854 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="proxy-httpd" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080866 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-kuttl-api-log" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 
14:24:37.080877 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" containerName="ceilometer-central-agent" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.080891 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9eb4ad7-832e-40ab-a921-f27b230508d6" containerName="watcher-api" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.084142 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.086482 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.086663 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.086899 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.087043 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.126596 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.137061 4794 scope.go:117] "RemoveContainer" containerID="96f0d2d088337a2d02cb54a747fccc0ed7b200211a4bc63186853ec7e93f0fe7" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.174170 4794 scope.go:117] "RemoveContainer" containerID="bdfb1b3d0dc1223653a0d0e7df4e99efab19dcbc297ce932bd6833ebfec4ac3f" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.186839 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-config-data\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.186977 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-run-httpd\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.187072 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.187163 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-scripts\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc 
kubenswrapper[4794]: I1215 14:24:37.187229 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.187257 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.187288 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhgl\" (UniqueName: \"kubernetes.io/projected/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-kube-api-access-vjhgl\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.187310 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-log-httpd\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.288797 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-custom-prometheus-ca\") pod \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.288899 4794 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nggxz\" (UniqueName: \"kubernetes.io/projected/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-kube-api-access-nggxz\") pod \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.288958 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-logs\") pod \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289020 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-combined-ca-bundle\") pod \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289049 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data\") pod \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289150 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-cert-memcached-mtls\") pod \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\" (UID: \"f0acbc62-7836-4ea9-a0e4-7c58d7b41896\") " Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289382 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289409 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289442 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhgl\" (UniqueName: \"kubernetes.io/projected/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-kube-api-access-vjhgl\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289464 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-log-httpd\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289493 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-config-data\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289538 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-run-httpd\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289608 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.289672 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-scripts\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.290660 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-logs" (OuterVolumeSpecName: "logs") pod "f0acbc62-7836-4ea9-a0e4-7c58d7b41896" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.294231 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-log-httpd\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.294685 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-run-httpd\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.321896 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.322183 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-config-data\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.323785 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.323935 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-kube-api-access-nggxz" (OuterVolumeSpecName: "kube-api-access-nggxz") pod "f0acbc62-7836-4ea9-a0e4-7c58d7b41896" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896"). InnerVolumeSpecName "kube-api-access-nggxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.326108 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhgl\" (UniqueName: \"kubernetes.io/projected/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-kube-api-access-vjhgl\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.326437 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.330938 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-scripts\") pod \"ceilometer-0\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.360756 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f0acbc62-7836-4ea9-a0e4-7c58d7b41896" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.364390 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0acbc62-7836-4ea9-a0e4-7c58d7b41896" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.388173 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "f0acbc62-7836-4ea9-a0e4-7c58d7b41896" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.391137 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.391167 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nggxz\" (UniqueName: \"kubernetes.io/projected/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-kube-api-access-nggxz\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.391177 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.391189 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.391197 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.392746 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data" (OuterVolumeSpecName: "config-data") pod "f0acbc62-7836-4ea9-a0e4-7c58d7b41896" (UID: "f0acbc62-7836-4ea9-a0e4-7c58d7b41896"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.438149 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.493103 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0acbc62-7836-4ea9-a0e4-7c58d7b41896-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.855822 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:24:37 crc kubenswrapper[4794]: W1215 14:24:37.866747 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b5a9e36_b184_41cc_855a_d8ef2933ceb3.slice/crio-49c31adc61986a31fefb6b79a924693a4f7eef6ccba1bfbf69687ca58a411723 WatchSource:0}: Error finding container 49c31adc61986a31fefb6b79a924693a4f7eef6ccba1bfbf69687ca58a411723: Status 404 returned error can't find the container with id 49c31adc61986a31fefb6b79a924693a4f7eef6ccba1bfbf69687ca58a411723 Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.998099 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f0acbc62-7836-4ea9-a0e4-7c58d7b41896","Type":"ContainerDied","Data":"b5179ad6769cfb7e50f0310a33911cd37bf59fec557edaef4e1b90f19268f4dd"} Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.998157 4794 scope.go:117] "RemoveContainer" containerID="2320096567ba32a7e21ce8c838966d3e0229ba9e685f2be581c50c7824663282" Dec 15 14:24:37 crc kubenswrapper[4794]: I1215 14:24:37.998259 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.003472 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerStarted","Data":"49c31adc61986a31fefb6b79a924693a4f7eef6ccba1bfbf69687ca58a411723"} Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.040090 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.055470 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.523399 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ll8g"] Dec 15 14:24:38 crc kubenswrapper[4794]: E1215 14:24:38.524183 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0acbc62-7836-4ea9-a0e4-7c58d7b41896" containerName="watcher-decision-engine" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.524202 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0acbc62-7836-4ea9-a0e4-7c58d7b41896" containerName="watcher-decision-engine" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.524373 4794 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f0acbc62-7836-4ea9-a0e4-7c58d7b41896" containerName="watcher-decision-engine" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.525042 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.533303 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ll8g"] Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.612319 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-672sq\" (UniqueName: \"kubernetes.io/projected/723627c3-a7ec-4694-80c1-011d492c5382-kube-api-access-672sq\") pod \"watcher-db-create-5ll8g\" (UID: \"723627c3-a7ec-4694-80c1-011d492c5382\") " pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.714490 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-672sq\" (UniqueName: \"kubernetes.io/projected/723627c3-a7ec-4694-80c1-011d492c5382-kube-api-access-672sq\") pod \"watcher-db-create-5ll8g\" (UID: \"723627c3-a7ec-4694-80c1-011d492c5382\") " pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.733431 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-672sq\" (UniqueName: \"kubernetes.io/projected/723627c3-a7ec-4694-80c1-011d492c5382-kube-api-access-672sq\") pod \"watcher-db-create-5ll8g\" (UID: \"723627c3-a7ec-4694-80c1-011d492c5382\") " pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.747414 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d4ec48-89a3-456c-824e-dc627d6f7b8b" path="/var/lib/kubelet/pods/b3d4ec48-89a3-456c-824e-dc627d6f7b8b/volumes" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.748294 4794 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0acbc62-7836-4ea9-a0e4-7c58d7b41896" path="/var/lib/kubelet/pods/f0acbc62-7836-4ea9-a0e4-7c58d7b41896/volumes" Dec 15 14:24:38 crc kubenswrapper[4794]: I1215 14:24:38.843494 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:39 crc kubenswrapper[4794]: I1215 14:24:39.291453 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ll8g"] Dec 15 14:24:39 crc kubenswrapper[4794]: W1215 14:24:39.298193 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod723627c3_a7ec_4694_80c1_011d492c5382.slice/crio-8df4bc17219e5a5e4d347c1fad4329229d0904be6a6f2662d9832305c559be1e WatchSource:0}: Error finding container 8df4bc17219e5a5e4d347c1fad4329229d0904be6a6f2662d9832305c559be1e: Status 404 returned error can't find the container with id 8df4bc17219e5a5e4d347c1fad4329229d0904be6a6f2662d9832305c559be1e Dec 15 14:24:40 crc kubenswrapper[4794]: I1215 14:24:40.023193 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerStarted","Data":"d01208ddca4d7fcee7acd98aad25682b5dfc72f78f1782546e5d0b180c0de7b8"} Dec 15 14:24:40 crc kubenswrapper[4794]: I1215 14:24:40.024889 4794 generic.go:334] "Generic (PLEG): container finished" podID="723627c3-a7ec-4694-80c1-011d492c5382" containerID="0ea076a88d3f6e8e404686548685e5731f96cfdb2a6322e9aa999ab38c93ed7d" exitCode=0 Dec 15 14:24:40 crc kubenswrapper[4794]: I1215 14:24:40.024918 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5ll8g" event={"ID":"723627c3-a7ec-4694-80c1-011d492c5382","Type":"ContainerDied","Data":"0ea076a88d3f6e8e404686548685e5731f96cfdb2a6322e9aa999ab38c93ed7d"} Dec 15 14:24:40 crc 
kubenswrapper[4794]: I1215 14:24:40.024934 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5ll8g" event={"ID":"723627c3-a7ec-4694-80c1-011d492c5382","Type":"ContainerStarted","Data":"8df4bc17219e5a5e4d347c1fad4329229d0904be6a6f2662d9832305c559be1e"} Dec 15 14:24:41 crc kubenswrapper[4794]: I1215 14:24:41.035345 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerStarted","Data":"687f9e6507ddea4ddd6aa72377e92725015210b74b9a562aed806e48089883cc"} Dec 15 14:24:41 crc kubenswrapper[4794]: I1215 14:24:41.036007 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerStarted","Data":"b8a26501a9d5662ae0053ce4256f52754221ed2bcd678937e10df0de7f3eecde"} Dec 15 14:24:41 crc kubenswrapper[4794]: I1215 14:24:41.501438 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:41 crc kubenswrapper[4794]: I1215 14:24:41.670763 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-672sq\" (UniqueName: \"kubernetes.io/projected/723627c3-a7ec-4694-80c1-011d492c5382-kube-api-access-672sq\") pod \"723627c3-a7ec-4694-80c1-011d492c5382\" (UID: \"723627c3-a7ec-4694-80c1-011d492c5382\") " Dec 15 14:24:41 crc kubenswrapper[4794]: I1215 14:24:41.686864 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723627c3-a7ec-4694-80c1-011d492c5382-kube-api-access-672sq" (OuterVolumeSpecName: "kube-api-access-672sq") pod "723627c3-a7ec-4694-80c1-011d492c5382" (UID: "723627c3-a7ec-4694-80c1-011d492c5382"). InnerVolumeSpecName "kube-api-access-672sq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:41 crc kubenswrapper[4794]: I1215 14:24:41.773001 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-672sq\" (UniqueName: \"kubernetes.io/projected/723627c3-a7ec-4694-80c1-011d492c5382-kube-api-access-672sq\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:42 crc kubenswrapper[4794]: I1215 14:24:42.049817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-5ll8g" event={"ID":"723627c3-a7ec-4694-80c1-011d492c5382","Type":"ContainerDied","Data":"8df4bc17219e5a5e4d347c1fad4329229d0904be6a6f2662d9832305c559be1e"} Dec 15 14:24:42 crc kubenswrapper[4794]: I1215 14:24:42.049895 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df4bc17219e5a5e4d347c1fad4329229d0904be6a6f2662d9832305c559be1e" Dec 15 14:24:42 crc kubenswrapper[4794]: I1215 14:24:42.050180 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-5ll8g" Dec 15 14:24:43 crc kubenswrapper[4794]: I1215 14:24:43.059924 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerStarted","Data":"b12b84fb6abf54b39b5043b9bc111d0f9fbf8486765e5cd9f0f976fbbf2f9b81"} Dec 15 14:24:43 crc kubenswrapper[4794]: I1215 14:24:43.060273 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:24:43 crc kubenswrapper[4794]: I1215 14:24:43.084337 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.990737111 podStartE2EDuration="6.084318952s" podCreationTimestamp="2025-12-15 14:24:37 +0000 UTC" firstStartedPulling="2025-12-15 14:24:37.869191511 +0000 UTC m=+1839.721213949" lastFinishedPulling="2025-12-15 14:24:41.962773352 +0000 UTC 
m=+1843.814795790" observedRunningTime="2025-12-15 14:24:43.077940732 +0000 UTC m=+1844.929963170" watchObservedRunningTime="2025-12-15 14:24:43.084318952 +0000 UTC m=+1844.936341390" Dec 15 14:24:46 crc kubenswrapper[4794]: I1215 14:24:46.737945 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:24:46 crc kubenswrapper[4794]: E1215 14:24:46.738788 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.634284 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-b208-account-create-p5q4x"] Dec 15 14:24:48 crc kubenswrapper[4794]: E1215 14:24:48.634845 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723627c3-a7ec-4694-80c1-011d492c5382" containerName="mariadb-database-create" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.634856 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="723627c3-a7ec-4694-80c1-011d492c5382" containerName="mariadb-database-create" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.634993 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="723627c3-a7ec-4694-80c1-011d492c5382" containerName="mariadb-database-create" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.635503 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.637504 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.655044 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-b208-account-create-p5q4x"] Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.686305 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpfrr\" (UniqueName: \"kubernetes.io/projected/c685a13f-0829-4054-b6e5-8dd6c4848668-kube-api-access-gpfrr\") pod \"watcher-b208-account-create-p5q4x\" (UID: \"c685a13f-0829-4054-b6e5-8dd6c4848668\") " pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.787020 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpfrr\" (UniqueName: \"kubernetes.io/projected/c685a13f-0829-4054-b6e5-8dd6c4848668-kube-api-access-gpfrr\") pod \"watcher-b208-account-create-p5q4x\" (UID: \"c685a13f-0829-4054-b6e5-8dd6c4848668\") " pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.818465 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpfrr\" (UniqueName: \"kubernetes.io/projected/c685a13f-0829-4054-b6e5-8dd6c4848668-kube-api-access-gpfrr\") pod \"watcher-b208-account-create-p5q4x\" (UID: \"c685a13f-0829-4054-b6e5-8dd6c4848668\") " pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:48 crc kubenswrapper[4794]: I1215 14:24:48.951659 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:49 crc kubenswrapper[4794]: I1215 14:24:49.389466 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-b208-account-create-p5q4x"] Dec 15 14:24:50 crc kubenswrapper[4794]: I1215 14:24:50.060198 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-kr9xn"] Dec 15 14:24:50 crc kubenswrapper[4794]: I1215 14:24:50.069900 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-kr9xn"] Dec 15 14:24:50 crc kubenswrapper[4794]: I1215 14:24:50.129249 4794 generic.go:334] "Generic (PLEG): container finished" podID="c685a13f-0829-4054-b6e5-8dd6c4848668" containerID="57eefbc6dd361dfdd7b6f564940323c6d04f3cb3ba5549ff8205f272b7d8a6d1" exitCode=0 Dec 15 14:24:50 crc kubenswrapper[4794]: I1215 14:24:50.129293 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" event={"ID":"c685a13f-0829-4054-b6e5-8dd6c4848668","Type":"ContainerDied","Data":"57eefbc6dd361dfdd7b6f564940323c6d04f3cb3ba5549ff8205f272b7d8a6d1"} Dec 15 14:24:50 crc kubenswrapper[4794]: I1215 14:24:50.129321 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" event={"ID":"c685a13f-0829-4054-b6e5-8dd6c4848668","Type":"ContainerStarted","Data":"306419b28692ad0d273935aac36425142d9cad0befcec0a1541f00cf4e33b404"} Dec 15 14:24:50 crc kubenswrapper[4794]: I1215 14:24:50.749919 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a1d66da-db28-48e2-82c9-bc3730fc1934" path="/var/lib/kubelet/pods/9a1d66da-db28-48e2-82c9-bc3730fc1934/volumes" Dec 15 14:24:51 crc kubenswrapper[4794]: I1215 14:24:51.507248 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:51 crc kubenswrapper[4794]: I1215 14:24:51.639387 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpfrr\" (UniqueName: \"kubernetes.io/projected/c685a13f-0829-4054-b6e5-8dd6c4848668-kube-api-access-gpfrr\") pod \"c685a13f-0829-4054-b6e5-8dd6c4848668\" (UID: \"c685a13f-0829-4054-b6e5-8dd6c4848668\") " Dec 15 14:24:51 crc kubenswrapper[4794]: I1215 14:24:51.644742 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c685a13f-0829-4054-b6e5-8dd6c4848668-kube-api-access-gpfrr" (OuterVolumeSpecName: "kube-api-access-gpfrr") pod "c685a13f-0829-4054-b6e5-8dd6c4848668" (UID: "c685a13f-0829-4054-b6e5-8dd6c4848668"). InnerVolumeSpecName "kube-api-access-gpfrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:51 crc kubenswrapper[4794]: I1215 14:24:51.740861 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpfrr\" (UniqueName: \"kubernetes.io/projected/c685a13f-0829-4054-b6e5-8dd6c4848668-kube-api-access-gpfrr\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:52 crc kubenswrapper[4794]: I1215 14:24:52.150034 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" event={"ID":"c685a13f-0829-4054-b6e5-8dd6c4848668","Type":"ContainerDied","Data":"306419b28692ad0d273935aac36425142d9cad0befcec0a1541f00cf4e33b404"} Dec 15 14:24:52 crc kubenswrapper[4794]: I1215 14:24:52.150277 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306419b28692ad0d273935aac36425142d9cad0befcec0a1541f00cf4e33b404" Dec 15 14:24:52 crc kubenswrapper[4794]: I1215 14:24:52.150337 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-b208-account-create-p5q4x" Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.912269 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-cflj5"] Dec 15 14:24:53 crc kubenswrapper[4794]: E1215 14:24:53.912886 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c685a13f-0829-4054-b6e5-8dd6c4848668" containerName="mariadb-account-create" Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.912899 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="c685a13f-0829-4054-b6e5-8dd6c4848668" containerName="mariadb-account-create" Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.913052 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="c685a13f-0829-4054-b6e5-8dd6c4848668" containerName="mariadb-account-create" Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.913599 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.918369 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-cflj5"] Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.918478 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-7k22n" Dec 15 14:24:53 crc kubenswrapper[4794]: I1215 14:24:53.919436 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.077832 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mx5j\" (UniqueName: \"kubernetes.io/projected/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-kube-api-access-8mx5j\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") 
" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.077949 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-config-data\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.077984 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.078016 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.179318 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-config-data\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.179373 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-db-sync-config-data\") pod 
\"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.179407 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.179446 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mx5j\" (UniqueName: \"kubernetes.io/projected/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-kube-api-access-8mx5j\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.186556 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.197427 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-db-sync-config-data\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.197834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-config-data\") pod 
\"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.202748 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mx5j\" (UniqueName: \"kubernetes.io/projected/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-kube-api-access-8mx5j\") pod \"watcher-kuttl-db-sync-cflj5\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.247516 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:54 crc kubenswrapper[4794]: W1215 14:24:54.549659 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e227e8f_9d4c_49c3_81bc_c5c16caa695d.slice/crio-f3c7da09e14cfa1916a5c31d6a6697e800722250cf76d783f287e07e577d4511 WatchSource:0}: Error finding container f3c7da09e14cfa1916a5c31d6a6697e800722250cf76d783f287e07e577d4511: Status 404 returned error can't find the container with id f3c7da09e14cfa1916a5c31d6a6697e800722250cf76d783f287e07e577d4511 Dec 15 14:24:54 crc kubenswrapper[4794]: I1215 14:24:54.559310 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-cflj5"] Dec 15 14:24:55 crc kubenswrapper[4794]: I1215 14:24:55.183507 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" event={"ID":"7e227e8f-9d4c-49c3-81bc-c5c16caa695d","Type":"ContainerStarted","Data":"099e44b5f21d9d55d2f7b9319227c38c45c0cdf0f4f68cadafd245465d28aabc"} Dec 15 14:24:55 crc kubenswrapper[4794]: I1215 14:24:55.183857 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" 
event={"ID":"7e227e8f-9d4c-49c3-81bc-c5c16caa695d","Type":"ContainerStarted","Data":"f3c7da09e14cfa1916a5c31d6a6697e800722250cf76d783f287e07e577d4511"} Dec 15 14:24:55 crc kubenswrapper[4794]: I1215 14:24:55.203288 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" podStartSLOduration=2.203264002 podStartE2EDuration="2.203264002s" podCreationTimestamp="2025-12-15 14:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:24:55.195929074 +0000 UTC m=+1857.047951512" watchObservedRunningTime="2025-12-15 14:24:55.203264002 +0000 UTC m=+1857.055286450" Dec 15 14:24:58 crc kubenswrapper[4794]: I1215 14:24:58.213118 4794 generic.go:334] "Generic (PLEG): container finished" podID="7e227e8f-9d4c-49c3-81bc-c5c16caa695d" containerID="099e44b5f21d9d55d2f7b9319227c38c45c0cdf0f4f68cadafd245465d28aabc" exitCode=0 Dec 15 14:24:58 crc kubenswrapper[4794]: I1215 14:24:58.213197 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" event={"ID":"7e227e8f-9d4c-49c3-81bc-c5c16caa695d","Type":"ContainerDied","Data":"099e44b5f21d9d55d2f7b9319227c38c45c0cdf0f4f68cadafd245465d28aabc"} Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.549990 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.667891 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-db-sync-config-data\") pod \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.668261 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mx5j\" (UniqueName: \"kubernetes.io/projected/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-kube-api-access-8mx5j\") pod \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.668393 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-config-data\") pod \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.668530 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-combined-ca-bundle\") pod \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\" (UID: \"7e227e8f-9d4c-49c3-81bc-c5c16caa695d\") " Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.674418 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7e227e8f-9d4c-49c3-81bc-c5c16caa695d" (UID: "7e227e8f-9d4c-49c3-81bc-c5c16caa695d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.677632 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-kube-api-access-8mx5j" (OuterVolumeSpecName: "kube-api-access-8mx5j") pod "7e227e8f-9d4c-49c3-81bc-c5c16caa695d" (UID: "7e227e8f-9d4c-49c3-81bc-c5c16caa695d"). InnerVolumeSpecName "kube-api-access-8mx5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.702619 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e227e8f-9d4c-49c3-81bc-c5c16caa695d" (UID: "7e227e8f-9d4c-49c3-81bc-c5c16caa695d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.720839 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-config-data" (OuterVolumeSpecName: "config-data") pod "7e227e8f-9d4c-49c3-81bc-c5c16caa695d" (UID: "7e227e8f-9d4c-49c3-81bc-c5c16caa695d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.770284 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mx5j\" (UniqueName: \"kubernetes.io/projected/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-kube-api-access-8mx5j\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.770318 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.770329 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:24:59 crc kubenswrapper[4794]: I1215 14:24:59.770338 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e227e8f-9d4c-49c3-81bc-c5c16caa695d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.235730 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" event={"ID":"7e227e8f-9d4c-49c3-81bc-c5c16caa695d","Type":"ContainerDied","Data":"f3c7da09e14cfa1916a5c31d6a6697e800722250cf76d783f287e07e577d4511"} Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.235797 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3c7da09e14cfa1916a5c31d6a6697e800722250cf76d783f287e07e577d4511" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.235843 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-cflj5" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.496381 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:00 crc kubenswrapper[4794]: E1215 14:25:00.497070 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e227e8f-9d4c-49c3-81bc-c5c16caa695d" containerName="watcher-kuttl-db-sync" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.497147 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e227e8f-9d4c-49c3-81bc-c5c16caa695d" containerName="watcher-kuttl-db-sync" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.497395 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e227e8f-9d4c-49c3-81bc-c5c16caa695d" containerName="watcher-kuttl-db-sync" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.498384 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.507989 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-7k22n" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.511285 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.519941 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.535515 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.537075 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.539668 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.584100 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmmjm\" (UniqueName: \"kubernetes.io/projected/1ac6ba99-779b-42d6-8b40-6143b73e02c9-kube-api-access-kmmjm\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.584337 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.584553 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.585294 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.585459 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ba99-779b-42d6-8b40-6143b73e02c9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.585490 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.602168 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.655380 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.657068 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.661753 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.664207 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mkkj\" (UniqueName: \"kubernetes.io/projected/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-kube-api-access-6mkkj\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689080 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689101 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689125 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689160 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689193 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ba99-779b-42d6-8b40-6143b73e02c9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689211 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689248 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689272 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmmjm\" (UniqueName: 
\"kubernetes.io/projected/1ac6ba99-779b-42d6-8b40-6143b73e02c9-kube-api-access-kmmjm\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689310 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689336 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.689357 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.690042 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ba99-779b-42d6-8b40-6143b73e02c9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.694969 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.695679 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.700426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.707288 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.709938 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmmjm\" (UniqueName: \"kubernetes.io/projected/1ac6ba99-779b-42d6-8b40-6143b73e02c9-kube-api-access-kmmjm\") pod \"watcher-kuttl-api-0\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.741068 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.792567 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2c196-d900-4878-8786-27133b208c91-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.792955 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.793389 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tl9\" (UniqueName: \"kubernetes.io/projected/9ca2c196-d900-4878-8786-27133b208c91-kube-api-access-z2tl9\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.793458 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.793611 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc 
kubenswrapper[4794]: I1215 14:25:00.793707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.795006 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mkkj\" (UniqueName: \"kubernetes.io/projected/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-kube-api-access-6mkkj\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.795055 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.795084 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.795123 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.795154 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.795207 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.805331 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.805455 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.806372 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.814191 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.815174 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.830766 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mkkj\" (UniqueName: \"kubernetes.io/projected/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-kube-api-access-6mkkj\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.881285 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.896674 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2c196-d900-4878-8786-27133b208c91-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.897002 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2tl9\" (UniqueName: \"kubernetes.io/projected/9ca2c196-d900-4878-8786-27133b208c91-kube-api-access-z2tl9\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.897106 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.897240 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.897333 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " 
pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.898533 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2c196-d900-4878-8786-27133b208c91-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.906119 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.908427 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.919147 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc kubenswrapper[4794]: I1215 14:25:00.940205 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2tl9\" (UniqueName: \"kubernetes.io/projected/9ca2c196-d900-4878-8786-27133b208c91-kube-api-access-z2tl9\") pod \"watcher-kuttl-applier-0\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:00 crc 
kubenswrapper[4794]: I1215 14:25:00.984067 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:01 crc kubenswrapper[4794]: I1215 14:25:01.260482 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"37758683c163a83b27c0dcff30a182ab662ba60635fa8dc89b1899be6fa8e6c5"} Dec 15 14:25:01 crc kubenswrapper[4794]: I1215 14:25:01.577634 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:01 crc kubenswrapper[4794]: I1215 14:25:01.671661 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:01 crc kubenswrapper[4794]: W1215 14:25:01.676636 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ca2c196_d900_4878_8786_27133b208c91.slice/crio-04cd43e654408c5aaaf9fc259998c4c1b42a4b563560e4191b1cd58cf3b4f74b WatchSource:0}: Error finding container 04cd43e654408c5aaaf9fc259998c4c1b42a4b563560e4191b1cd58cf3b4f74b: Status 404 returned error can't find the container with id 04cd43e654408c5aaaf9fc259998c4c1b42a4b563560e4191b1cd58cf3b4f74b Dec 15 14:25:01 crc kubenswrapper[4794]: W1215 14:25:01.681759 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd49cd20c_9b89_4625_9bb5_cc8f88bc341d.slice/crio-8ca6ff1c667a6829bb765bfcde84e0b218320d9ce9b5a6187ee5c94db8a96087 WatchSource:0}: Error finding container 8ca6ff1c667a6829bb765bfcde84e0b218320d9ce9b5a6187ee5c94db8a96087: Status 404 returned error can't find the container with id 8ca6ff1c667a6829bb765bfcde84e0b218320d9ce9b5a6187ee5c94db8a96087 Dec 15 14:25:01 crc kubenswrapper[4794]: I1215 14:25:01.688193 4794 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.269599 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1ac6ba99-779b-42d6-8b40-6143b73e02c9","Type":"ContainerStarted","Data":"d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8"} Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.270208 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1ac6ba99-779b-42d6-8b40-6143b73e02c9","Type":"ContainerStarted","Data":"ea1a299f2fdfc30de4537735753353ebfdb097da6988565f5a62b05a40893f11"} Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.272700 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9ca2c196-d900-4878-8786-27133b208c91","Type":"ContainerStarted","Data":"42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce"} Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.272756 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9ca2c196-d900-4878-8786-27133b208c91","Type":"ContainerStarted","Data":"04cd43e654408c5aaaf9fc259998c4c1b42a4b563560e4191b1cd58cf3b4f74b"} Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.277066 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d49cd20c-9b89-4625-9bb5-cc8f88bc341d","Type":"ContainerStarted","Data":"0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff"} Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.277111 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"d49cd20c-9b89-4625-9bb5-cc8f88bc341d","Type":"ContainerStarted","Data":"8ca6ff1c667a6829bb765bfcde84e0b218320d9ce9b5a6187ee5c94db8a96087"} Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.297919 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.297903923 podStartE2EDuration="2.297903923s" podCreationTimestamp="2025-12-15 14:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:02.292947622 +0000 UTC m=+1864.144970060" watchObservedRunningTime="2025-12-15 14:25:02.297903923 +0000 UTC m=+1864.149926361" Dec 15 14:25:02 crc kubenswrapper[4794]: I1215 14:25:02.317740 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.317719344 podStartE2EDuration="2.317719344s" podCreationTimestamp="2025-12-15 14:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:02.316064047 +0000 UTC m=+1864.168086485" watchObservedRunningTime="2025-12-15 14:25:02.317719344 +0000 UTC m=+1864.169741792" Dec 15 14:25:03 crc kubenswrapper[4794]: I1215 14:25:03.293918 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1ac6ba99-779b-42d6-8b40-6143b73e02c9","Type":"ContainerStarted","Data":"b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf"} Dec 15 14:25:03 crc kubenswrapper[4794]: I1215 14:25:03.321437 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.321404738 podStartE2EDuration="3.321404738s" podCreationTimestamp="2025-12-15 14:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:03.315985625 +0000 UTC m=+1865.168008073" watchObservedRunningTime="2025-12-15 14:25:03.321404738 +0000 UTC m=+1865.173427206" Dec 15 14:25:04 crc kubenswrapper[4794]: I1215 14:25:04.300911 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:05 crc kubenswrapper[4794]: I1215 14:25:05.815650 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:05 crc kubenswrapper[4794]: I1215 14:25:05.985295 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:06 crc kubenswrapper[4794]: I1215 14:25:06.315336 4794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 15 14:25:06 crc kubenswrapper[4794]: I1215 14:25:06.621891 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.450457 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.639354 4794 scope.go:117] "RemoveContainer" containerID="74627950df29d946c3be1a0887e6d8c4747f339f8a5f5c1a0947c460fc983af9" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.665824 4794 scope.go:117] "RemoveContainer" containerID="f255e61078f3a7dea1313ae49d02192dd17289a25bb7d23875ec3f362b4816f5" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.696604 4794 scope.go:117] "RemoveContainer" containerID="742dd0788cdf0e6b03e619a41ed224af76564bbac9ae89340c5eae0c36f821cf" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.754895 4794 scope.go:117] "RemoveContainer" containerID="b81824a9af090577ec689b82fae2bf93e6b502bcd6a63a3540b357ef464e1ce5" Dec 15 14:25:07 
crc kubenswrapper[4794]: I1215 14:25:07.775195 4794 scope.go:117] "RemoveContainer" containerID="fd515d84d675aac17d39d17a5e36446635ab6395ad11067065d37e2706481757" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.828926 4794 scope.go:117] "RemoveContainer" containerID="b44791a5d9cbe4958c70ca8d1b8848ee4c6900899a1fe0369fc383a03fbd3ef9" Dec 15 14:25:07 crc kubenswrapper[4794]: I1215 14:25:07.846886 4794 scope.go:117] "RemoveContainer" containerID="2bf9b6dd86b8f0e7a17bbe02883181ece0e03d94815275a0f416efeb963f6f8f" Dec 15 14:25:10 crc kubenswrapper[4794]: I1215 14:25:10.815969 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:10 crc kubenswrapper[4794]: I1215 14:25:10.825255 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:10 crc kubenswrapper[4794]: I1215 14:25:10.882197 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:10 crc kubenswrapper[4794]: I1215 14:25:10.910646 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:10 crc kubenswrapper[4794]: I1215 14:25:10.985245 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:11 crc kubenswrapper[4794]: I1215 14:25:11.008890 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:11 crc kubenswrapper[4794]: I1215 14:25:11.378452 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:11 crc kubenswrapper[4794]: I1215 14:25:11.387496 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:11 crc kubenswrapper[4794]: I1215 14:25:11.411517 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:11 crc kubenswrapper[4794]: I1215 14:25:11.423898 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:12 crc kubenswrapper[4794]: I1215 14:25:12.570860 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:12 crc kubenswrapper[4794]: I1215 14:25:12.571180 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-central-agent" containerID="cri-o://d01208ddca4d7fcee7acd98aad25682b5dfc72f78f1782546e5d0b180c0de7b8" gracePeriod=30 Dec 15 14:25:12 crc kubenswrapper[4794]: I1215 14:25:12.571250 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="proxy-httpd" containerID="cri-o://b12b84fb6abf54b39b5043b9bc111d0f9fbf8486765e5cd9f0f976fbbf2f9b81" gracePeriod=30 Dec 15 14:25:12 crc kubenswrapper[4794]: I1215 14:25:12.571285 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-notification-agent" containerID="cri-o://b8a26501a9d5662ae0053ce4256f52754221ed2bcd678937e10df0de7f3eecde" gracePeriod=30 Dec 15 14:25:12 crc kubenswrapper[4794]: I1215 14:25:12.571297 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="sg-core" 
containerID="cri-o://687f9e6507ddea4ddd6aa72377e92725015210b74b9a562aed806e48089883cc" gracePeriod=30 Dec 15 14:25:13 crc kubenswrapper[4794]: I1215 14:25:13.398600 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerID="b12b84fb6abf54b39b5043b9bc111d0f9fbf8486765e5cd9f0f976fbbf2f9b81" exitCode=0 Dec 15 14:25:13 crc kubenswrapper[4794]: I1215 14:25:13.399171 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerID="687f9e6507ddea4ddd6aa72377e92725015210b74b9a562aed806e48089883cc" exitCode=2 Dec 15 14:25:13 crc kubenswrapper[4794]: I1215 14:25:13.399186 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerID="d01208ddca4d7fcee7acd98aad25682b5dfc72f78f1782546e5d0b180c0de7b8" exitCode=0 Dec 15 14:25:13 crc kubenswrapper[4794]: I1215 14:25:13.398678 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerDied","Data":"b12b84fb6abf54b39b5043b9bc111d0f9fbf8486765e5cd9f0f976fbbf2f9b81"} Dec 15 14:25:13 crc kubenswrapper[4794]: I1215 14:25:13.399297 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerDied","Data":"687f9e6507ddea4ddd6aa72377e92725015210b74b9a562aed806e48089883cc"} Dec 15 14:25:13 crc kubenswrapper[4794]: I1215 14:25:13.399313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerDied","Data":"d01208ddca4d7fcee7acd98aad25682b5dfc72f78f1782546e5d0b180c0de7b8"} Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.451235 4794 generic.go:334] "Generic (PLEG): container finished" podID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" 
containerID="b8a26501a9d5662ae0053ce4256f52754221ed2bcd678937e10df0de7f3eecde" exitCode=0 Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.451319 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerDied","Data":"b8a26501a9d5662ae0053ce4256f52754221ed2bcd678937e10df0de7f3eecde"} Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.601865 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723437 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjhgl\" (UniqueName: \"kubernetes.io/projected/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-kube-api-access-vjhgl\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723574 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-run-httpd\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723720 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-scripts\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723790 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-sg-core-conf-yaml\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 
14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723816 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-config-data\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723889 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-ceilometer-tls-certs\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.723967 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-log-httpd\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.724015 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-combined-ca-bundle\") pod \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\" (UID: \"3b5a9e36-b184-41cc-855a-d8ef2933ceb3\") " Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.724105 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.724435 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.724538 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.724560 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.740653 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-kube-api-access-vjhgl" (OuterVolumeSpecName: "kube-api-access-vjhgl") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "kube-api-access-vjhgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.758718 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-scripts" (OuterVolumeSpecName: "scripts") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.799717 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.827572 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjhgl\" (UniqueName: \"kubernetes.io/projected/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-kube-api-access-vjhgl\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.827614 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.827624 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.878423 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.924602 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-config-data" (OuterVolumeSpecName: "config-data") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.926727 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b5a9e36-b184-41cc-855a-d8ef2933ceb3" (UID: "3b5a9e36-b184-41cc-855a-d8ef2933ceb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.928679 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.928722 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:17 crc kubenswrapper[4794]: I1215 14:25:17.928734 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5a9e36-b184-41cc-855a-d8ef2933ceb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.462554 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"3b5a9e36-b184-41cc-855a-d8ef2933ceb3","Type":"ContainerDied","Data":"49c31adc61986a31fefb6b79a924693a4f7eef6ccba1bfbf69687ca58a411723"} Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.462622 4794 scope.go:117] "RemoveContainer" containerID="b12b84fb6abf54b39b5043b9bc111d0f9fbf8486765e5cd9f0f976fbbf2f9b81" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.462771 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.487068 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-cflj5"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.496699 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-cflj5"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.499524 4794 scope.go:117] "RemoveContainer" containerID="687f9e6507ddea4ddd6aa72377e92725015210b74b9a562aed806e48089883cc" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.527569 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.528749 4794 scope.go:117] "RemoveContainer" containerID="b8a26501a9d5662ae0053ce4256f52754221ed2bcd678937e10df0de7f3eecde" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.543655 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.564775 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: E1215 14:25:18.565149 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-notification-agent" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565169 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-notification-agent" Dec 15 14:25:18 crc kubenswrapper[4794]: E1215 14:25:18.565197 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-central-agent" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565204 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-central-agent" Dec 15 14:25:18 crc kubenswrapper[4794]: E1215 14:25:18.565214 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="proxy-httpd" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565221 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="proxy-httpd" Dec 15 14:25:18 crc kubenswrapper[4794]: E1215 14:25:18.565236 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="sg-core" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565244 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="sg-core" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565390 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-central-agent" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565405 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="sg-core" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.565418 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="ceilometer-notification-agent" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 
14:25:18.565427 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" containerName="proxy-httpd" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.567125 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.574630 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.574876 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="d49cd20c-9b89-4625-9bb5-cc8f88bc341d" containerName="watcher-decision-engine" containerID="cri-o://0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff" gracePeriod=30 Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.583627 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherb208-account-delete-mmwfm"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.584865 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.589220 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.600134 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.600328 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.600456 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.600506 4794 scope.go:117] "RemoveContainer" containerID="d01208ddca4d7fcee7acd98aad25682b5dfc72f78f1782546e5d0b180c0de7b8" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.622643 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherb208-account-delete-mmwfm"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.639798 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.639863 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-scripts\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.639912 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.639936 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.639987 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-config-data\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.640025 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.640045 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr55v\" (UniqueName: \"kubernetes.io/projected/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb-kube-api-access-mr55v\") pod \"watcherb208-account-delete-mmwfm\" (UID: \"44f5b0b8-81b3-4c18-9054-e0e3b0eacddb\") " pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.640091 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.640140 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrhb\" (UniqueName: \"kubernetes.io/projected/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-kube-api-access-jsrhb\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.701167 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.701450 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-kuttl-api-log" containerID="cri-o://d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8" gracePeriod=30 Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.701852 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-api" containerID="cri-o://b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf" gracePeriod=30 Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741600 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 
14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741656 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741686 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-config-data\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741737 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741775 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr55v\" (UniqueName: \"kubernetes.io/projected/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb-kube-api-access-mr55v\") pod \"watcherb208-account-delete-mmwfm\" (UID: \"44f5b0b8-81b3-4c18-9054-e0e3b0eacddb\") " pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741814 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741859 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jsrhb\" (UniqueName: \"kubernetes.io/projected/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-kube-api-access-jsrhb\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741886 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.741911 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-scripts\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.744364 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.747641 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-scripts\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.748158 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " 
pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.749501 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.750366 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.754426 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-config-data\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.758476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.761694 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5a9e36-b184-41cc-855a-d8ef2933ceb3" path="/var/lib/kubelet/pods/3b5a9e36-b184-41cc-855a-d8ef2933ceb3/volumes" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.762500 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e227e8f-9d4c-49c3-81bc-c5c16caa695d" path="/var/lib/kubelet/pods/7e227e8f-9d4c-49c3-81bc-c5c16caa695d/volumes" Dec 15 
14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.763071 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ll8g"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.763103 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-5ll8g"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.766711 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrhb\" (UniqueName: \"kubernetes.io/projected/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-kube-api-access-jsrhb\") pod \"ceilometer-0\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.771676 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-b208-account-create-p5q4x"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.782660 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-b208-account-create-p5q4x"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.789206 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherb208-account-delete-mmwfm"] Dec 15 14:25:18 crc kubenswrapper[4794]: E1215 14:25:18.789906 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-mr55v], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" podUID="44f5b0b8-81b3-4c18-9054-e0e3b0eacddb" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.792078 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr55v\" (UniqueName: \"kubernetes.io/projected/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb-kube-api-access-mr55v\") pod \"watcherb208-account-delete-mmwfm\" (UID: \"44f5b0b8-81b3-4c18-9054-e0e3b0eacddb\") " 
pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.796868 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.797123 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="9ca2c196-d900-4878-8786-27133b208c91" containerName="watcher-applier" containerID="cri-o://42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce" gracePeriod=30 Dec 15 14:25:18 crc kubenswrapper[4794]: I1215 14:25:18.926158 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.006138 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-nxgrl"] Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.007507 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.013190 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nxgrl"] Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.162455 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jw58\" (UniqueName: \"kubernetes.io/projected/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7-kube-api-access-8jw58\") pod \"watcher-db-create-nxgrl\" (UID: \"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7\") " pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.263378 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jw58\" (UniqueName: \"kubernetes.io/projected/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7-kube-api-access-8jw58\") pod \"watcher-db-create-nxgrl\" (UID: \"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7\") " pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.289030 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jw58\" (UniqueName: \"kubernetes.io/projected/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7-kube-api-access-8jw58\") pod \"watcher-db-create-nxgrl\" (UID: \"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7\") " pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.364441 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.482120 4794 generic.go:334] "Generic (PLEG): container finished" podID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerID="d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8" exitCode=143 Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.482231 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.482232 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1ac6ba99-779b-42d6-8b40-6143b73e02c9","Type":"ContainerDied","Data":"d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8"} Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.505404 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.513066 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:19 crc kubenswrapper[4794]: W1215 14:25:19.518599 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf7ceb0b_2c4a_4b0a_890d_9b8f3e70bf77.slice/crio-bff53d29d3c7d74d3f0cc1f3a03add8b559e46f79ccc61a8a17ba597eb906b8a WatchSource:0}: Error finding container bff53d29d3c7d74d3f0cc1f3a03add8b559e46f79ccc61a8a17ba597eb906b8a: Status 404 returned error can't find the container with id bff53d29d3c7d74d3f0cc1f3a03add8b559e46f79ccc61a8a17ba597eb906b8a Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.651153 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nxgrl"] Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.670145 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr55v\" (UniqueName: \"kubernetes.io/projected/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb-kube-api-access-mr55v\") pod \"44f5b0b8-81b3-4c18-9054-e0e3b0eacddb\" (UID: \"44f5b0b8-81b3-4c18-9054-e0e3b0eacddb\") " Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.674219 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb-kube-api-access-mr55v" (OuterVolumeSpecName: "kube-api-access-mr55v") pod "44f5b0b8-81b3-4c18-9054-e0e3b0eacddb" (UID: "44f5b0b8-81b3-4c18-9054-e0e3b0eacddb"). InnerVolumeSpecName "kube-api-access-mr55v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:19 crc kubenswrapper[4794]: I1215 14:25:19.772163 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr55v\" (UniqueName: \"kubernetes.io/projected/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb-kube-api-access-mr55v\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.136716 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.180399 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ba99-779b-42d6-8b40-6143b73e02c9-logs\") pod \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.180446 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmmjm\" (UniqueName: \"kubernetes.io/projected/1ac6ba99-779b-42d6-8b40-6143b73e02c9-kube-api-access-kmmjm\") pod \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.180498 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-combined-ca-bundle\") pod \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.180549 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-config-data\") pod \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.180608 
4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-custom-prometheus-ca\") pod \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.180629 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-cert-memcached-mtls\") pod \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\" (UID: \"1ac6ba99-779b-42d6-8b40-6143b73e02c9\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.185996 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac6ba99-779b-42d6-8b40-6143b73e02c9-logs" (OuterVolumeSpecName: "logs") pod "1ac6ba99-779b-42d6-8b40-6143b73e02c9" (UID: "1ac6ba99-779b-42d6-8b40-6143b73e02c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.188048 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac6ba99-779b-42d6-8b40-6143b73e02c9-kube-api-access-kmmjm" (OuterVolumeSpecName: "kube-api-access-kmmjm") pod "1ac6ba99-779b-42d6-8b40-6143b73e02c9" (UID: "1ac6ba99-779b-42d6-8b40-6143b73e02c9"). InnerVolumeSpecName "kube-api-access-kmmjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.210200 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.213117 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1ac6ba99-779b-42d6-8b40-6143b73e02c9" (UID: "1ac6ba99-779b-42d6-8b40-6143b73e02c9"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.247565 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ac6ba99-779b-42d6-8b40-6143b73e02c9" (UID: "1ac6ba99-779b-42d6-8b40-6143b73e02c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.281747 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "1ac6ba99-779b-42d6-8b40-6143b73e02c9" (UID: "1ac6ba99-779b-42d6-8b40-6143b73e02c9"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.281888 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-combined-ca-bundle\") pod \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.281940 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-cert-memcached-mtls\") pod \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.281970 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-custom-prometheus-ca\") pod \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.281994 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-config-data\") pod \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.282051 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mkkj\" (UniqueName: \"kubernetes.io/projected/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-kube-api-access-6mkkj\") pod \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.282139 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-logs\") pod \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\" (UID: \"d49cd20c-9b89-4625-9bb5-cc8f88bc341d\") " Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.282996 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-logs" (OuterVolumeSpecName: "logs") pod "d49cd20c-9b89-4625-9bb5-cc8f88bc341d" (UID: "d49cd20c-9b89-4625-9bb5-cc8f88bc341d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.283197 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.283211 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.283225 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.283237 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.283247 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ba99-779b-42d6-8b40-6143b73e02c9-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.283260 4794 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmmjm\" (UniqueName: \"kubernetes.io/projected/1ac6ba99-779b-42d6-8b40-6143b73e02c9-kube-api-access-kmmjm\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.290256 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-config-data" (OuterVolumeSpecName: "config-data") pod "1ac6ba99-779b-42d6-8b40-6143b73e02c9" (UID: "1ac6ba99-779b-42d6-8b40-6143b73e02c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.290272 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-kube-api-access-6mkkj" (OuterVolumeSpecName: "kube-api-access-6mkkj") pod "d49cd20c-9b89-4625-9bb5-cc8f88bc341d" (UID: "d49cd20c-9b89-4625-9bb5-cc8f88bc341d"). InnerVolumeSpecName "kube-api-access-6mkkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.306415 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d49cd20c-9b89-4625-9bb5-cc8f88bc341d" (UID: "d49cd20c-9b89-4625-9bb5-cc8f88bc341d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.307429 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d49cd20c-9b89-4625-9bb5-cc8f88bc341d" (UID: "d49cd20c-9b89-4625-9bb5-cc8f88bc341d"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.345469 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "d49cd20c-9b89-4625-9bb5-cc8f88bc341d" (UID: "d49cd20c-9b89-4625-9bb5-cc8f88bc341d"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.349258 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-config-data" (OuterVolumeSpecName: "config-data") pod "d49cd20c-9b89-4625-9bb5-cc8f88bc341d" (UID: "d49cd20c-9b89-4625-9bb5-cc8f88bc341d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.384747 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.384785 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.384798 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.384808 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.384818 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mkkj\" (UniqueName: \"kubernetes.io/projected/d49cd20c-9b89-4625-9bb5-cc8f88bc341d-kube-api-access-6mkkj\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.384829 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac6ba99-779b-42d6-8b40-6143b73e02c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.491500 4794 generic.go:334] "Generic (PLEG): container finished" podID="43a4772e-0b90-4c8f-9f25-a8aa83ea34d7" containerID="65847b0b92bda8651c4e95e0630bf0fc8e348ad0ffb19f3759c637bcfd746711" exitCode=0 Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.491966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-nxgrl" event={"ID":"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7","Type":"ContainerDied","Data":"65847b0b92bda8651c4e95e0630bf0fc8e348ad0ffb19f3759c637bcfd746711"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.491996 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-nxgrl" event={"ID":"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7","Type":"ContainerStarted","Data":"33f0e871d2e027a189ebd2c95ef07e71641084c410eb990e5d667b6d38d4808b"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.494236 4794 generic.go:334] "Generic (PLEG): container finished" podID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerID="b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf" exitCode=0 Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.494313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"1ac6ba99-779b-42d6-8b40-6143b73e02c9","Type":"ContainerDied","Data":"b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.494340 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1ac6ba99-779b-42d6-8b40-6143b73e02c9","Type":"ContainerDied","Data":"ea1a299f2fdfc30de4537735753353ebfdb097da6988565f5a62b05a40893f11"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.494380 4794 scope.go:117] "RemoveContainer" containerID="b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.494509 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.497127 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerStarted","Data":"a92b5efd1bb8370db0e4722bdcbbbf74e823704ada73990a656280dece5992f8"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.497167 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerStarted","Data":"bff53d29d3c7d74d3f0cc1f3a03add8b559e46f79ccc61a8a17ba597eb906b8a"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.499090 4794 generic.go:334] "Generic (PLEG): container finished" podID="d49cd20c-9b89-4625-9bb5-cc8f88bc341d" containerID="0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff" exitCode=0 Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.499153 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcherb208-account-delete-mmwfm" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.499711 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.500215 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d49cd20c-9b89-4625-9bb5-cc8f88bc341d","Type":"ContainerDied","Data":"0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.500270 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"d49cd20c-9b89-4625-9bb5-cc8f88bc341d","Type":"ContainerDied","Data":"8ca6ff1c667a6829bb765bfcde84e0b218320d9ce9b5a6187ee5c94db8a96087"} Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.517500 4794 scope.go:117] "RemoveContainer" containerID="d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.538989 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.539654 4794 scope.go:117] "RemoveContainer" containerID="b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf" Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.540140 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf\": container with ID starting with b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf not found: ID does not exist" containerID="b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.540183 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf"} err="failed to get container status \"b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf\": rpc error: code = NotFound desc = could not find container \"b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf\": container with ID starting with b268500e3dda4186a58baff2ac246f6199c3d6459b9e3a7b42c31eeb3f429edf not found: ID does not exist" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.540207 4794 scope.go:117] "RemoveContainer" containerID="d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8" Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.540532 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8\": container with ID starting with d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8 not found: ID does not exist" containerID="d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.540558 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8"} err="failed to get container status \"d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8\": rpc error: code = NotFound desc = could not find container \"d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8\": container with ID starting with d046b658b4e880e6f91c252027880d9c3be196ece960cd316731c2d6a69363d8 not found: ID does not exist" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.540575 4794 scope.go:117] "RemoveContainer" containerID="0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 
14:25:20.545679 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.561846 4794 scope.go:117] "RemoveContainer" containerID="0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff" Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.562846 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff\": container with ID starting with 0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff not found: ID does not exist" containerID="0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.562884 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff"} err="failed to get container status \"0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff\": rpc error: code = NotFound desc = could not find container \"0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff\": container with ID starting with 0c98017c0d5088c3105b0fa35816757d387de6eeef2804d41053c7a271727cff not found: ID does not exist" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.579852 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherb208-account-delete-mmwfm"] Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.587135 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherb208-account-delete-mmwfm"] Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.594245 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.606269 4794 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.746803 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" path="/var/lib/kubelet/pods/1ac6ba99-779b-42d6-8b40-6143b73e02c9/volumes" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.747356 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f5b0b8-81b3-4c18-9054-e0e3b0eacddb" path="/var/lib/kubelet/pods/44f5b0b8-81b3-4c18-9054-e0e3b0eacddb/volumes" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.747717 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723627c3-a7ec-4694-80c1-011d492c5382" path="/var/lib/kubelet/pods/723627c3-a7ec-4694-80c1-011d492c5382/volumes" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.748206 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c685a13f-0829-4054-b6e5-8dd6c4848668" path="/var/lib/kubelet/pods/c685a13f-0829-4054-b6e5-8dd6c4848668/volumes" Dec 15 14:25:20 crc kubenswrapper[4794]: I1215 14:25:20.749266 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49cd20c-9b89-4625-9bb5-cc8f88bc341d" path="/var/lib/kubelet/pods/d49cd20c-9b89-4625-9bb5-cc8f88bc341d/volumes" Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.987958 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.989500 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.990936 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:25:20 crc kubenswrapper[4794]: E1215 14:25:20.990967 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="9ca2c196-d900-4878-8786-27133b208c91" containerName="watcher-applier" Dec 15 14:25:21 crc kubenswrapper[4794]: I1215 14:25:21.878148 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.010409 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jw58\" (UniqueName: \"kubernetes.io/projected/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7-kube-api-access-8jw58\") pod \"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7\" (UID: \"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7\") " Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.021786 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7-kube-api-access-8jw58" (OuterVolumeSpecName: "kube-api-access-8jw58") pod "43a4772e-0b90-4c8f-9f25-a8aa83ea34d7" (UID: "43a4772e-0b90-4c8f-9f25-a8aa83ea34d7"). InnerVolumeSpecName "kube-api-access-8jw58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.112480 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jw58\" (UniqueName: \"kubernetes.io/projected/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7-kube-api-access-8jw58\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.123434 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.563640 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerStarted","Data":"2fe35d24bc1d3d27af9a4655fdabb66456c49a9ceb2490a23a1ba0780efb881d"} Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.565385 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-nxgrl" event={"ID":"43a4772e-0b90-4c8f-9f25-a8aa83ea34d7","Type":"ContainerDied","Data":"33f0e871d2e027a189ebd2c95ef07e71641084c410eb990e5d667b6d38d4808b"} Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.565418 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f0e871d2e027a189ebd2c95ef07e71641084c410eb990e5d667b6d38d4808b" Dec 15 14:25:22 crc kubenswrapper[4794]: I1215 14:25:22.565474 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nxgrl" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.583384 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerStarted","Data":"3258bbf8f0c3c481c2d97e008b40c4043c516ff1be6c3edde58fe72403b7bd5a"} Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.585054 4794 generic.go:334] "Generic (PLEG): container finished" podID="9ca2c196-d900-4878-8786-27133b208c91" containerID="42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce" exitCode=0 Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.585098 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9ca2c196-d900-4878-8786-27133b208c91","Type":"ContainerDied","Data":"42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce"} Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.585126 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"9ca2c196-d900-4878-8786-27133b208c91","Type":"ContainerDied","Data":"04cd43e654408c5aaaf9fc259998c4c1b42a4b563560e4191b1cd58cf3b4f74b"} Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.585137 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04cd43e654408c5aaaf9fc259998c4c1b42a4b563560e4191b1cd58cf3b4f74b" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.660048 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.837541 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-config-data\") pod \"9ca2c196-d900-4878-8786-27133b208c91\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.837631 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-combined-ca-bundle\") pod \"9ca2c196-d900-4878-8786-27133b208c91\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.837683 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-cert-memcached-mtls\") pod \"9ca2c196-d900-4878-8786-27133b208c91\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.837709 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2c196-d900-4878-8786-27133b208c91-logs\") pod \"9ca2c196-d900-4878-8786-27133b208c91\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.837730 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2tl9\" (UniqueName: \"kubernetes.io/projected/9ca2c196-d900-4878-8786-27133b208c91-kube-api-access-z2tl9\") pod \"9ca2c196-d900-4878-8786-27133b208c91\" (UID: \"9ca2c196-d900-4878-8786-27133b208c91\") " Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.838137 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9ca2c196-d900-4878-8786-27133b208c91-logs" (OuterVolumeSpecName: "logs") pod "9ca2c196-d900-4878-8786-27133b208c91" (UID: "9ca2c196-d900-4878-8786-27133b208c91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.839205 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2c196-d900-4878-8786-27133b208c91-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.841731 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca2c196-d900-4878-8786-27133b208c91-kube-api-access-z2tl9" (OuterVolumeSpecName: "kube-api-access-z2tl9") pod "9ca2c196-d900-4878-8786-27133b208c91" (UID: "9ca2c196-d900-4878-8786-27133b208c91"). InnerVolumeSpecName "kube-api-access-z2tl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.864164 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ca2c196-d900-4878-8786-27133b208c91" (UID: "9ca2c196-d900-4878-8786-27133b208c91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.881660 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-config-data" (OuterVolumeSpecName: "config-data") pod "9ca2c196-d900-4878-8786-27133b208c91" (UID: "9ca2c196-d900-4878-8786-27133b208c91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.901755 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "9ca2c196-d900-4878-8786-27133b208c91" (UID: "9ca2c196-d900-4878-8786-27133b208c91"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.940281 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.940321 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.940335 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2tl9\" (UniqueName: \"kubernetes.io/projected/9ca2c196-d900-4878-8786-27133b208c91-kube-api-access-z2tl9\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:23 crc kubenswrapper[4794]: I1215 14:25:23.940353 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2c196-d900-4878-8786-27133b208c91-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:24 crc kubenswrapper[4794]: I1215 14:25:24.593067 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:24 crc kubenswrapper[4794]: I1215 14:25:24.629331 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:24 crc kubenswrapper[4794]: I1215 14:25:24.638002 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:24 crc kubenswrapper[4794]: I1215 14:25:24.749163 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca2c196-d900-4878-8786-27133b208c91" path="/var/lib/kubelet/pods/9ca2c196-d900-4878-8786-27133b208c91/volumes" Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.602817 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerStarted","Data":"880605b1cba82b2ff72c88640026e2670e6e3776a9a7bd5d7669f6cd2ea1ed1d"} Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.602960 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-central-agent" containerID="cri-o://a92b5efd1bb8370db0e4722bdcbbbf74e823704ada73990a656280dece5992f8" gracePeriod=30 Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.602991 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="proxy-httpd" containerID="cri-o://880605b1cba82b2ff72c88640026e2670e6e3776a9a7bd5d7669f6cd2ea1ed1d" gracePeriod=30 Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.603051 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="sg-core" 
containerID="cri-o://3258bbf8f0c3c481c2d97e008b40c4043c516ff1be6c3edde58fe72403b7bd5a" gracePeriod=30 Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.603093 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-notification-agent" containerID="cri-o://2fe35d24bc1d3d27af9a4655fdabb66456c49a9ceb2490a23a1ba0780efb881d" gracePeriod=30 Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.603233 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:25 crc kubenswrapper[4794]: I1215 14:25:25.629985 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.033536805 podStartE2EDuration="7.629967189s" podCreationTimestamp="2025-12-15 14:25:18 +0000 UTC" firstStartedPulling="2025-12-15 14:25:19.528022622 +0000 UTC m=+1881.380045070" lastFinishedPulling="2025-12-15 14:25:25.124453016 +0000 UTC m=+1886.976475454" observedRunningTime="2025-12-15 14:25:25.628701703 +0000 UTC m=+1887.480724161" watchObservedRunningTime="2025-12-15 14:25:25.629967189 +0000 UTC m=+1887.481989627" Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.616860 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerID="880605b1cba82b2ff72c88640026e2670e6e3776a9a7bd5d7669f6cd2ea1ed1d" exitCode=0 Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.617199 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerID="3258bbf8f0c3c481c2d97e008b40c4043c516ff1be6c3edde58fe72403b7bd5a" exitCode=2 Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.617209 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" 
containerID="2fe35d24bc1d3d27af9a4655fdabb66456c49a9ceb2490a23a1ba0780efb881d" exitCode=0 Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.617217 4794 generic.go:334] "Generic (PLEG): container finished" podID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerID="a92b5efd1bb8370db0e4722bdcbbbf74e823704ada73990a656280dece5992f8" exitCode=0 Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.616908 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerDied","Data":"880605b1cba82b2ff72c88640026e2670e6e3776a9a7bd5d7669f6cd2ea1ed1d"} Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.617246 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerDied","Data":"3258bbf8f0c3c481c2d97e008b40c4043c516ff1be6c3edde58fe72403b7bd5a"} Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.617256 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerDied","Data":"2fe35d24bc1d3d27af9a4655fdabb66456c49a9ceb2490a23a1ba0780efb881d"} Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.617265 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerDied","Data":"a92b5efd1bb8370db0e4722bdcbbbf74e823704ada73990a656280dece5992f8"} Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.855401 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989211 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-scripts\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989302 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-run-httpd\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989332 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrhb\" (UniqueName: \"kubernetes.io/projected/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-kube-api-access-jsrhb\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989383 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-config-data\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989471 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-log-httpd\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989517 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-combined-ca-bundle\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989577 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-ceilometer-tls-certs\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989619 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-sg-core-conf-yaml\") pod \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\" (UID: \"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77\") " Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989747 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.989995 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.990789 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.996341 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-scripts" (OuterVolumeSpecName: "scripts") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:26 crc kubenswrapper[4794]: I1215 14:25:26.999911 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-kube-api-access-jsrhb" (OuterVolumeSpecName: "kube-api-access-jsrhb") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "kube-api-access-jsrhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.023396 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.050388 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.080306 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.091460 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.091511 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.091526 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsrhb\" (UniqueName: \"kubernetes.io/projected/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-kube-api-access-jsrhb\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.091539 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.091551 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.091562 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.102399 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-config-data" (OuterVolumeSpecName: "config-data") pod "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" (UID: "cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.193422 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.627478 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77","Type":"ContainerDied","Data":"bff53d29d3c7d74d3f0cc1f3a03add8b559e46f79ccc61a8a17ba597eb906b8a"} Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.627567 4794 scope.go:117] "RemoveContainer" containerID="880605b1cba82b2ff72c88640026e2670e6e3776a9a7bd5d7669f6cd2ea1ed1d" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.627568 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.647604 4794 scope.go:117] "RemoveContainer" containerID="3258bbf8f0c3c481c2d97e008b40c4043c516ff1be6c3edde58fe72403b7bd5a" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.665497 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.672246 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.679865 4794 scope.go:117] "RemoveContainer" containerID="2fe35d24bc1d3d27af9a4655fdabb66456c49a9ceb2490a23a1ba0780efb881d" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.696968 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697298 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a4772e-0b90-4c8f-9f25-a8aa83ea34d7" containerName="mariadb-database-create" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697323 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a4772e-0b90-4c8f-9f25-a8aa83ea34d7" containerName="mariadb-database-create" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697342 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-api" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697393 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-api" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697408 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-notification-agent" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697415 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-notification-agent" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697425 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca2c196-d900-4878-8786-27133b208c91" containerName="watcher-applier" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697432 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca2c196-d900-4878-8786-27133b208c91" containerName="watcher-applier" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697453 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="proxy-httpd" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697461 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="proxy-httpd" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697474 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="sg-core" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697481 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="sg-core" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697493 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-kuttl-api-log" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697499 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-kuttl-api-log" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697511 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49cd20c-9b89-4625-9bb5-cc8f88bc341d" containerName="watcher-decision-engine" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697517 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d49cd20c-9b89-4625-9bb5-cc8f88bc341d" containerName="watcher-decision-engine" Dec 15 14:25:27 crc kubenswrapper[4794]: E1215 14:25:27.697524 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-central-agent" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697529 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-central-agent" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697694 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-notification-agent" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697702 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49cd20c-9b89-4625-9bb5-cc8f88bc341d" containerName="watcher-decision-engine" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697715 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a4772e-0b90-4c8f-9f25-a8aa83ea34d7" containerName="mariadb-database-create" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697725 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-api" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697734 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="ceilometer-central-agent" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697743 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca2c196-d900-4878-8786-27133b208c91" containerName="watcher-applier" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697752 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="sg-core" Dec 
15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697765 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac6ba99-779b-42d6-8b40-6143b73e02c9" containerName="watcher-kuttl-api-log" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.697775 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" containerName="proxy-httpd" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.699154 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.701149 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.701476 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.701953 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.720708 4794 scope.go:117] "RemoveContainer" containerID="a92b5efd1bb8370db0e4722bdcbbbf74e823704ada73990a656280dece5992f8" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.726010 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804549 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804601 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-scripts\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804650 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804667 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-config-data\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804700 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-run-httpd\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804727 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-log-httpd\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804758 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.804910 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2whpq\" (UniqueName: \"kubernetes.io/projected/60bdf2da-411f-483d-a415-477200c358e2-kube-api-access-2whpq\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.906987 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907059 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2whpq\" (UniqueName: \"kubernetes.io/projected/60bdf2da-411f-483d-a415-477200c358e2-kube-api-access-2whpq\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907140 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907178 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-scripts\") pod \"ceilometer-0\" (UID: 
\"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907230 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907250 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-config-data\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907273 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-run-httpd\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907292 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-log-httpd\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.907989 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-run-httpd\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.908014 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-log-httpd\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.910921 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-scripts\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.911064 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.911219 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.911718 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-config-data\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.911782 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:27 crc kubenswrapper[4794]: I1215 14:25:27.932371 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2whpq\" (UniqueName: \"kubernetes.io/projected/60bdf2da-411f-483d-a415-477200c358e2-kube-api-access-2whpq\") pod \"ceilometer-0\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:28 crc kubenswrapper[4794]: I1215 14:25:28.023851 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:28 crc kubenswrapper[4794]: I1215 14:25:28.463702 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:28 crc kubenswrapper[4794]: I1215 14:25:28.639002 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerStarted","Data":"006568e4ed5f48880fc7aa8623e2ff3bb38251086df4bae6f8282de9b4a82750"} Dec 15 14:25:28 crc kubenswrapper[4794]: I1215 14:25:28.747657 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77" path="/var/lib/kubelet/pods/cf7ceb0b-2c4a-4b0a-890d-9b8f3e70bf77/volumes" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.066168 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-gbb2p"] Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.067536 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.074091 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.103895 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-gbb2p"] Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.124922 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42xp\" (UniqueName: \"kubernetes.io/projected/0fe66d29-318e-409e-8ff0-d5bfafa2d782-kube-api-access-f42xp\") pod \"watcher-test-account-create-gbb2p\" (UID: \"0fe66d29-318e-409e-8ff0-d5bfafa2d782\") " pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.226352 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42xp\" (UniqueName: \"kubernetes.io/projected/0fe66d29-318e-409e-8ff0-d5bfafa2d782-kube-api-access-f42xp\") pod \"watcher-test-account-create-gbb2p\" (UID: \"0fe66d29-318e-409e-8ff0-d5bfafa2d782\") " pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.248156 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42xp\" (UniqueName: \"kubernetes.io/projected/0fe66d29-318e-409e-8ff0-d5bfafa2d782-kube-api-access-f42xp\") pod \"watcher-test-account-create-gbb2p\" (UID: \"0fe66d29-318e-409e-8ff0-d5bfafa2d782\") " pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.489412 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.712219 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerStarted","Data":"355aa55b93259d90dc3a07f7ca12198e64bc515f59c676928c2fbd482b87c604"} Dec 15 14:25:29 crc kubenswrapper[4794]: I1215 14:25:29.871692 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-gbb2p"] Dec 15 14:25:30 crc kubenswrapper[4794]: I1215 14:25:30.721434 4794 generic.go:334] "Generic (PLEG): container finished" podID="0fe66d29-318e-409e-8ff0-d5bfafa2d782" containerID="a1bee7fda2eebabc5ef6d0cd694d718e9a66482f94298892df5ca0bdcd76d1b2" exitCode=0 Dec 15 14:25:30 crc kubenswrapper[4794]: I1215 14:25:30.723058 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" event={"ID":"0fe66d29-318e-409e-8ff0-d5bfafa2d782","Type":"ContainerDied","Data":"a1bee7fda2eebabc5ef6d0cd694d718e9a66482f94298892df5ca0bdcd76d1b2"} Dec 15 14:25:30 crc kubenswrapper[4794]: I1215 14:25:30.723246 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" event={"ID":"0fe66d29-318e-409e-8ff0-d5bfafa2d782","Type":"ContainerStarted","Data":"e47831e0d6de6d1962c40e557f1bc6018a3f3fe2033b12ef6f755f2d9a9e8270"} Dec 15 14:25:31 crc kubenswrapper[4794]: I1215 14:25:31.738707 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerStarted","Data":"e478c7d48a88d7b58af836cdc28e6acb736f51f3717ed204157220df429d16cc"} Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.259749 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.403375 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f42xp\" (UniqueName: \"kubernetes.io/projected/0fe66d29-318e-409e-8ff0-d5bfafa2d782-kube-api-access-f42xp\") pod \"0fe66d29-318e-409e-8ff0-d5bfafa2d782\" (UID: \"0fe66d29-318e-409e-8ff0-d5bfafa2d782\") " Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.409732 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe66d29-318e-409e-8ff0-d5bfafa2d782-kube-api-access-f42xp" (OuterVolumeSpecName: "kube-api-access-f42xp") pod "0fe66d29-318e-409e-8ff0-d5bfafa2d782" (UID: "0fe66d29-318e-409e-8ff0-d5bfafa2d782"). InnerVolumeSpecName "kube-api-access-f42xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.504935 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f42xp\" (UniqueName: \"kubernetes.io/projected/0fe66d29-318e-409e-8ff0-d5bfafa2d782-kube-api-access-f42xp\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.756607 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerStarted","Data":"0e2eee288141a28506b3f3ab29dbac9c8fcdc1c9eaa37d7f6105744c5ecaacd2"} Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.758307 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" event={"ID":"0fe66d29-318e-409e-8ff0-d5bfafa2d782","Type":"ContainerDied","Data":"e47831e0d6de6d1962c40e557f1bc6018a3f3fe2033b12ef6f755f2d9a9e8270"} Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.758333 4794 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e47831e0d6de6d1962c40e557f1bc6018a3f3fe2033b12ef6f755f2d9a9e8270" Dec 15 14:25:32 crc kubenswrapper[4794]: I1215 14:25:32.758356 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-gbb2p" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.244301 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r"] Dec 15 14:25:34 crc kubenswrapper[4794]: E1215 14:25:34.245047 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe66d29-318e-409e-8ff0-d5bfafa2d782" containerName="mariadb-account-create" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.245063 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe66d29-318e-409e-8ff0-d5bfafa2d782" containerName="mariadb-account-create" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.245243 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe66d29-318e-409e-8ff0-d5bfafa2d782" containerName="mariadb-account-create" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.245896 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.248333 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.252708 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-v7l6x" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.312458 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r"] Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.335473 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.335532 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97g72\" (UniqueName: \"kubernetes.io/projected/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-kube-api-access-97g72\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.335556 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-config-data\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.342652 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-db-sync-config-data\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.444446 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-db-sync-config-data\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.444910 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.444950 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97g72\" (UniqueName: \"kubernetes.io/projected/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-kube-api-access-97g72\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.444980 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-config-data\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 
14:25:34.452586 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-config-data\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.453964 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-db-sync-config-data\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.455125 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.465829 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97g72\" (UniqueName: \"kubernetes.io/projected/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-kube-api-access-97g72\") pod \"watcher-kuttl-db-sync-7tf8r\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:34 crc kubenswrapper[4794]: I1215 14:25:34.565055 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.008954 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r"] Dec 15 14:25:35 crc kubenswrapper[4794]: W1215 14:25:35.018471 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c2ac705_11fa_4ee0_a422_a695ff5cf5bf.slice/crio-0c1fa131bee85e1c341348fce97feda4a74077edd97fed98ff9862e284425660 WatchSource:0}: Error finding container 0c1fa131bee85e1c341348fce97feda4a74077edd97fed98ff9862e284425660: Status 404 returned error can't find the container with id 0c1fa131bee85e1c341348fce97feda4a74077edd97fed98ff9862e284425660 Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.787479 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerStarted","Data":"50d80cfb70b0a1253db0b3343804d25e4c7547107837898eae340a1f3ca98baf"} Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.787862 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.790500 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" event={"ID":"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf","Type":"ContainerStarted","Data":"9678a2af744f197f5c759a672f274ca8c7cdf9b80a380d1068fd064d98e1b73e"} Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.790537 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" event={"ID":"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf","Type":"ContainerStarted","Data":"0c1fa131bee85e1c341348fce97feda4a74077edd97fed98ff9862e284425660"} Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.813739 
4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.9211446479999998 podStartE2EDuration="8.813719234s" podCreationTimestamp="2025-12-15 14:25:27 +0000 UTC" firstStartedPulling="2025-12-15 14:25:28.471156716 +0000 UTC m=+1890.323179154" lastFinishedPulling="2025-12-15 14:25:35.363731302 +0000 UTC m=+1897.215753740" observedRunningTime="2025-12-15 14:25:35.808515587 +0000 UTC m=+1897.660538045" watchObservedRunningTime="2025-12-15 14:25:35.813719234 +0000 UTC m=+1897.665741672" Dec 15 14:25:35 crc kubenswrapper[4794]: I1215 14:25:35.839869 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" podStartSLOduration=1.839846624 podStartE2EDuration="1.839846624s" podCreationTimestamp="2025-12-15 14:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:35.834831462 +0000 UTC m=+1897.686853900" watchObservedRunningTime="2025-12-15 14:25:35.839846624 +0000 UTC m=+1897.691869062" Dec 15 14:25:37 crc kubenswrapper[4794]: I1215 14:25:37.808457 4794 generic.go:334] "Generic (PLEG): container finished" podID="2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" containerID="9678a2af744f197f5c759a672f274ca8c7cdf9b80a380d1068fd064d98e1b73e" exitCode=0 Dec 15 14:25:37 crc kubenswrapper[4794]: I1215 14:25:37.808517 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" event={"ID":"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf","Type":"ContainerDied","Data":"9678a2af744f197f5c759a672f274ca8c7cdf9b80a380d1068fd064d98e1b73e"} Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.124861 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.217889 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-db-sync-config-data\") pod \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.218290 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-config-data\") pod \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.218848 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-combined-ca-bundle\") pod \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.218949 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97g72\" (UniqueName: \"kubernetes.io/projected/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-kube-api-access-97g72\") pod \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\" (UID: \"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf\") " Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.224735 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-kube-api-access-97g72" (OuterVolumeSpecName: "kube-api-access-97g72") pod "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" (UID: "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf"). InnerVolumeSpecName "kube-api-access-97g72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.230654 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" (UID: "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.240677 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" (UID: "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.256810 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-config-data" (OuterVolumeSpecName: "config-data") pod "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" (UID: "2c2ac705-11fa-4ee0-a422-a695ff5cf5bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.321166 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.321205 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.321220 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97g72\" (UniqueName: \"kubernetes.io/projected/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-kube-api-access-97g72\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.321232 4794 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.829768 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" event={"ID":"2c2ac705-11fa-4ee0-a422-a695ff5cf5bf","Type":"ContainerDied","Data":"0c1fa131bee85e1c341348fce97feda4a74077edd97fed98ff9862e284425660"} Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.829815 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c1fa131bee85e1c341348fce97feda4a74077edd97fed98ff9862e284425660" Dec 15 14:25:39 crc kubenswrapper[4794]: I1215 14:25:39.829817 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.390156 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:40 crc kubenswrapper[4794]: E1215 14:25:40.391317 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" containerName="watcher-kuttl-db-sync" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.391390 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" containerName="watcher-kuttl-db-sync" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.391613 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" containerName="watcher-kuttl-db-sync" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.392475 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.395125 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-v7l6x" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.395363 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.406026 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.425844 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.427658 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.449965 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.451217 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.454269 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.462039 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.473683 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.502427 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.503787 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.508663 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.534530 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540366 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540422 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540559 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-logs\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540615 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: 
\"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540657 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540697 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540805 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr87l\" (UniqueName: \"kubernetes.io/projected/112e9d28-233f-460a-856a-04778ef398d0-kube-api-access-pr87l\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540834 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540868 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962lf\" (UniqueName: 
\"kubernetes.io/projected/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-kube-api-access-962lf\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.540931 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541081 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541127 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541180 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112e9d28-233f-460a-856a-04778ef398d0-logs\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541219 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" 
(UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541233 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70af09-6df0-4675-ae4a-6b460676478e-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541267 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.541294 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcfwc\" (UniqueName: \"kubernetes.io/projected/eb70af09-6df0-4675-ae4a-6b460676478e-kube-api-access-rcfwc\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.642883 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.642947 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.642966 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643002 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfpjc\" (UniqueName: \"kubernetes.io/projected/052edc18-54e2-4ed4-8913-6a1288c0b12e-kube-api-access-mfpjc\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643129 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-logs\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643187 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643252 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643293 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643337 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643377 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643405 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr87l\" (UniqueName: \"kubernetes.io/projected/112e9d28-233f-460a-856a-04778ef398d0-kube-api-access-pr87l\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643437 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052edc18-54e2-4ed4-8913-6a1288c0b12e-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643460 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643507 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962lf\" (UniqueName: \"kubernetes.io/projected/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-kube-api-access-962lf\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643650 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643693 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643724 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643748 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643788 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112e9d28-233f-460a-856a-04778ef398d0-logs\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643833 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643853 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70af09-6df0-4675-ae4a-6b460676478e-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643891 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643916 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcfwc\" (UniqueName: \"kubernetes.io/projected/eb70af09-6df0-4675-ae4a-6b460676478e-kube-api-access-rcfwc\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.643510 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-logs\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.645107 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112e9d28-233f-460a-856a-04778ef398d0-logs\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.645239 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70af09-6df0-4675-ae4a-6b460676478e-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.647702 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: 
\"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.647922 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.650618 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.652176 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.657512 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.657912 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.658073 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.658311 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.658580 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.658917 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.659084 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.661334 4794 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcfwc\" (UniqueName: \"kubernetes.io/projected/eb70af09-6df0-4675-ae4a-6b460676478e-kube-api-access-rcfwc\") pod \"watcher-kuttl-applier-0\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.662293 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr87l\" (UniqueName: \"kubernetes.io/projected/112e9d28-233f-460a-856a-04778ef398d0-kube-api-access-pr87l\") pod \"watcher-kuttl-api-0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.663134 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962lf\" (UniqueName: \"kubernetes.io/projected/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-kube-api-access-962lf\") pod \"watcher-kuttl-api-1\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.714092 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.745374 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.745410 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.746073 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052edc18-54e2-4ed4-8913-6a1288c0b12e-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.746160 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.746221 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: 
\"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.746240 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfpjc\" (UniqueName: \"kubernetes.io/projected/052edc18-54e2-4ed4-8913-6a1288c0b12e-kube-api-access-mfpjc\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.746356 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.746997 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052edc18-54e2-4ed4-8913-6a1288c0b12e-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.749341 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.750695 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.759277 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.759430 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.770476 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfpjc\" (UniqueName: \"kubernetes.io/projected/052edc18-54e2-4ed4-8913-6a1288c0b12e-kube-api-access-mfpjc\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.772516 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:40 crc kubenswrapper[4794]: I1215 14:25:40.874341 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.380622 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.446870 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:25:41 crc kubenswrapper[4794]: W1215 14:25:41.460814 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112e9d28_233f_460a_856a_04778ef398d0.slice/crio-86f191a9d33be2a0881743f6839f1ad5ecca836a813593b5ce6eb66375a5547a WatchSource:0}: Error finding container 86f191a9d33be2a0881743f6839f1ad5ecca836a813593b5ce6eb66375a5547a: Status 404 returned error can't find the container with id 86f191a9d33be2a0881743f6839f1ad5ecca836a813593b5ce6eb66375a5547a Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.550480 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:25:41 crc kubenswrapper[4794]: W1215 14:25:41.556857 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb70af09_6df0_4675_ae4a_6b460676478e.slice/crio-91cbc27823e9fe5869d8566a7833cbf5c383776e5fb3c264b7fc5935dd9abdc4 WatchSource:0}: Error finding container 91cbc27823e9fe5869d8566a7833cbf5c383776e5fb3c264b7fc5935dd9abdc4: Status 404 returned error can't find the container with id 91cbc27823e9fe5869d8566a7833cbf5c383776e5fb3c264b7fc5935dd9abdc4 Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.626692 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:25:41 crc kubenswrapper[4794]: W1215 14:25:41.639832 4794 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod052edc18_54e2_4ed4_8913_6a1288c0b12e.slice/crio-4b9ce2ba9e92202c2b521f70b6177b8e8cffd07fbd20eb58de1bd8e05bc7c1dc WatchSource:0}: Error finding container 4b9ce2ba9e92202c2b521f70b6177b8e8cffd07fbd20eb58de1bd8e05bc7c1dc: Status 404 returned error can't find the container with id 4b9ce2ba9e92202c2b521f70b6177b8e8cffd07fbd20eb58de1bd8e05bc7c1dc Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.942964 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"112e9d28-233f-460a-856a-04778ef398d0","Type":"ContainerStarted","Data":"86f191a9d33be2a0881743f6839f1ad5ecca836a813593b5ce6eb66375a5547a"} Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.944351 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"eb70af09-6df0-4675-ae4a-6b460676478e","Type":"ContainerStarted","Data":"91cbc27823e9fe5869d8566a7833cbf5c383776e5fb3c264b7fc5935dd9abdc4"} Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.945493 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"a2a4eea6-4594-41ff-ba05-f09d845ccbe5","Type":"ContainerStarted","Data":"fa69ed245baee4569c56dde38d96906f7950b35374b96d427c8ba975c7bae7d5"} Dec 15 14:25:41 crc kubenswrapper[4794]: I1215 14:25:41.946317 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"052edc18-54e2-4ed4-8913-6a1288c0b12e","Type":"ContainerStarted","Data":"4b9ce2ba9e92202c2b521f70b6177b8e8cffd07fbd20eb58de1bd8e05bc7c1dc"} Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.957238 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"112e9d28-233f-460a-856a-04778ef398d0","Type":"ContainerStarted","Data":"83c648c206ea539fc89d990490f8e824b54447a8aa55baa4e667d59632ff11f0"} Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.957601 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.957615 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"112e9d28-233f-460a-856a-04778ef398d0","Type":"ContainerStarted","Data":"c4dda8c40d51bbf29e2f7410778e55955b4618b3737d50027ea25e8f58b49d7b"} Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.959046 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.220:9322/\": dial tcp 10.217.0.220:9322: connect: connection refused" Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.959505 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"eb70af09-6df0-4675-ae4a-6b460676478e","Type":"ContainerStarted","Data":"e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a"} Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.962058 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"a2a4eea6-4594-41ff-ba05-f09d845ccbe5","Type":"ContainerStarted","Data":"7611ababa19817f26008b86f91f3b122125c842ba489bc600b2cbc64153faf13"} Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.962104 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"a2a4eea6-4594-41ff-ba05-f09d845ccbe5","Type":"ContainerStarted","Data":"5d45c714ff581fb6879ea02ad616cca0b2ad826bb90814c31be88025e79bd9d1"} Dec 15 14:25:42 crc 
kubenswrapper[4794]: I1215 14:25:42.962718 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.963285 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.221:9322/\": dial tcp 10.217.0.221:9322: connect: connection refused" Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.964605 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"052edc18-54e2-4ed4-8913-6a1288c0b12e","Type":"ContainerStarted","Data":"38ecefd4cee8659ac403a64c32995829abebe7e6d1637176048a08a5ed2e4cef"} Dec 15 14:25:42 crc kubenswrapper[4794]: I1215 14:25:42.986714 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.986691744 podStartE2EDuration="2.986691744s" podCreationTimestamp="2025-12-15 14:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:42.982840105 +0000 UTC m=+1904.834862563" watchObservedRunningTime="2025-12-15 14:25:42.986691744 +0000 UTC m=+1904.838714192" Dec 15 14:25:43 crc kubenswrapper[4794]: I1215 14:25:43.014315 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=3.014295705 podStartE2EDuration="3.014295705s" podCreationTimestamp="2025-12-15 14:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:43.003643283 +0000 UTC m=+1904.855665741" watchObservedRunningTime="2025-12-15 14:25:43.014295705 +0000 UTC 
m=+1904.866318143" Dec 15 14:25:43 crc kubenswrapper[4794]: I1215 14:25:43.026500 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=3.02648459 podStartE2EDuration="3.02648459s" podCreationTimestamp="2025-12-15 14:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:43.022856057 +0000 UTC m=+1904.874878515" watchObservedRunningTime="2025-12-15 14:25:43.02648459 +0000 UTC m=+1904.878507028" Dec 15 14:25:45 crc kubenswrapper[4794]: I1215 14:25:45.715140 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:45 crc kubenswrapper[4794]: I1215 14:25:45.748738 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:45 crc kubenswrapper[4794]: I1215 14:25:45.773448 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:46 crc kubenswrapper[4794]: I1215 14:25:46.251834 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:46 crc kubenswrapper[4794]: I1215 14:25:46.283175 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=6.283154244 podStartE2EDuration="6.283154244s" podCreationTimestamp="2025-12-15 14:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-15 14:25:43.053214316 +0000 UTC m=+1904.905236764" watchObservedRunningTime="2025-12-15 14:25:46.283154244 +0000 UTC m=+1908.135176692" Dec 15 14:25:46 crc kubenswrapper[4794]: I1215 14:25:46.570452 4794 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.714518 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.720075 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.751221 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.757213 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.774823 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.809100 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.877277 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:50 crc kubenswrapper[4794]: I1215 14:25:50.913851 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:51 crc kubenswrapper[4794]: I1215 14:25:51.036508 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:51 crc kubenswrapper[4794]: I1215 14:25:51.042896 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:25:51 crc 
kubenswrapper[4794]: I1215 14:25:51.042954 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:25:51 crc kubenswrapper[4794]: I1215 14:25:51.079913 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:25:51 crc kubenswrapper[4794]: I1215 14:25:51.080251 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:25:53 crc kubenswrapper[4794]: I1215 14:25:53.440312 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:53 crc kubenswrapper[4794]: I1215 14:25:53.441012 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="sg-core" containerID="cri-o://0e2eee288141a28506b3f3ab29dbac9c8fcdc1c9eaa37d7f6105744c5ecaacd2" gracePeriod=30 Dec 15 14:25:53 crc kubenswrapper[4794]: I1215 14:25:53.441061 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="proxy-httpd" containerID="cri-o://50d80cfb70b0a1253db0b3343804d25e4c7547107837898eae340a1f3ca98baf" gracePeriod=30 Dec 15 14:25:53 crc kubenswrapper[4794]: I1215 14:25:53.441033 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-notification-agent" containerID="cri-o://e478c7d48a88d7b58af836cdc28e6acb736f51f3717ed204157220df429d16cc" gracePeriod=30 Dec 15 14:25:53 crc kubenswrapper[4794]: I1215 14:25:53.440967 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" 
podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-central-agent" containerID="cri-o://355aa55b93259d90dc3a07f7ca12198e64bc515f59c676928c2fbd482b87c604" gracePeriod=30 Dec 15 14:25:53 crc kubenswrapper[4794]: I1215 14:25:53.540676 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.217:3000/\": read tcp 10.217.0.2:49370->10.217.0.217:3000: read: connection reset by peer" Dec 15 14:25:54 crc kubenswrapper[4794]: I1215 14:25:54.064155 4794 generic.go:334] "Generic (PLEG): container finished" podID="60bdf2da-411f-483d-a415-477200c358e2" containerID="50d80cfb70b0a1253db0b3343804d25e4c7547107837898eae340a1f3ca98baf" exitCode=0 Dec 15 14:25:54 crc kubenswrapper[4794]: I1215 14:25:54.064406 4794 generic.go:334] "Generic (PLEG): container finished" podID="60bdf2da-411f-483d-a415-477200c358e2" containerID="0e2eee288141a28506b3f3ab29dbac9c8fcdc1c9eaa37d7f6105744c5ecaacd2" exitCode=2 Dec 15 14:25:54 crc kubenswrapper[4794]: I1215 14:25:54.064416 4794 generic.go:334] "Generic (PLEG): container finished" podID="60bdf2da-411f-483d-a415-477200c358e2" containerID="355aa55b93259d90dc3a07f7ca12198e64bc515f59c676928c2fbd482b87c604" exitCode=0 Dec 15 14:25:54 crc kubenswrapper[4794]: I1215 14:25:54.064267 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerDied","Data":"50d80cfb70b0a1253db0b3343804d25e4c7547107837898eae340a1f3ca98baf"} Dec 15 14:25:54 crc kubenswrapper[4794]: I1215 14:25:54.064448 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerDied","Data":"0e2eee288141a28506b3f3ab29dbac9c8fcdc1c9eaa37d7f6105744c5ecaacd2"} Dec 15 14:25:54 crc kubenswrapper[4794]: I1215 
14:25:54.064461 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerDied","Data":"355aa55b93259d90dc3a07f7ca12198e64bc515f59c676928c2fbd482b87c604"} Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.098140 4794 generic.go:334] "Generic (PLEG): container finished" podID="60bdf2da-411f-483d-a415-477200c358e2" containerID="e478c7d48a88d7b58af836cdc28e6acb736f51f3717ed204157220df429d16cc" exitCode=0 Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.098229 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerDied","Data":"e478c7d48a88d7b58af836cdc28e6acb736f51f3717ed204157220df429d16cc"} Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.098728 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"60bdf2da-411f-483d-a415-477200c358e2","Type":"ContainerDied","Data":"006568e4ed5f48880fc7aa8623e2ff3bb38251086df4bae6f8282de9b4a82750"} Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.098746 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="006568e4ed5f48880fc7aa8623e2ff3bb38251086df4bae6f8282de9b4a82750" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.119917 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.193981 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-run-httpd\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194089 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-ceilometer-tls-certs\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194139 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-sg-core-conf-yaml\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194224 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-scripts\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194270 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2whpq\" (UniqueName: \"kubernetes.io/projected/60bdf2da-411f-483d-a415-477200c358e2-kube-api-access-2whpq\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194293 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-log-httpd\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194337 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-combined-ca-bundle\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194363 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-config-data\") pod \"60bdf2da-411f-483d-a415-477200c358e2\" (UID: \"60bdf2da-411f-483d-a415-477200c358e2\") " Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.194959 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.195020 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.195330 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.195350 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60bdf2da-411f-483d-a415-477200c358e2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.200768 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-scripts" (OuterVolumeSpecName: "scripts") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.200824 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60bdf2da-411f-483d-a415-477200c358e2-kube-api-access-2whpq" (OuterVolumeSpecName: "kube-api-access-2whpq") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "kube-api-access-2whpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.224624 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.262244 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.277975 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.297230 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.297262 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2whpq\" (UniqueName: \"kubernetes.io/projected/60bdf2da-411f-483d-a415-477200c358e2-kube-api-access-2whpq\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.297277 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.297287 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.297297 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.308499 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-config-data" (OuterVolumeSpecName: "config-data") pod "60bdf2da-411f-483d-a415-477200c358e2" (UID: "60bdf2da-411f-483d-a415-477200c358e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:25:58 crc kubenswrapper[4794]: I1215 14:25:58.399016 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdf2da-411f-483d-a415-477200c358e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.107223 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.131704 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.141237 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.160353 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:59 crc kubenswrapper[4794]: E1215 14:25:59.160915 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-central-agent" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.160946 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-central-agent" Dec 15 14:25:59 crc kubenswrapper[4794]: E1215 14:25:59.160977 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-notification-agent" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.160990 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-notification-agent" Dec 15 14:25:59 crc kubenswrapper[4794]: E1215 14:25:59.161032 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="proxy-httpd" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.161046 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="proxy-httpd" Dec 15 14:25:59 crc kubenswrapper[4794]: E1215 14:25:59.161066 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="sg-core" Dec 15 14:25:59 crc 
kubenswrapper[4794]: I1215 14:25:59.161077 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="sg-core" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.161313 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="sg-core" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.161334 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-notification-agent" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.161359 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="proxy-httpd" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.161388 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="ceilometer-central-agent" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.163561 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.165849 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.165880 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.166455 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.171368 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215004 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-scripts\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215076 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-run-httpd\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215121 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215175 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2d7\" (UniqueName: \"kubernetes.io/projected/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-kube-api-access-kh2d7\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215201 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215257 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215291 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-log-httpd\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.215320 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-config-data\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.316769 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-run-httpd\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.316831 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.316881 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2d7\" (UniqueName: \"kubernetes.io/projected/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-kube-api-access-kh2d7\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.316905 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.316954 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.316984 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-log-httpd\") pod 
\"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.317010 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-config-data\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.317058 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-scripts\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.318632 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-run-httpd\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.318632 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-log-httpd\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.320985 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.321169 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-scripts\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.321725 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.322412 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.338687 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-config-data\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.348244 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2d7\" (UniqueName: \"kubernetes.io/projected/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-kube-api-access-kh2d7\") pod \"ceilometer-0\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.494953 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:25:59 crc kubenswrapper[4794]: I1215 14:25:59.846948 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.131914 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl"] Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.133228 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.135428 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerStarted","Data":"c164c07b86ecb939ba6211125a5c380a0023a81eabc19ca98d9be6b42725922b"} Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.135753 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.136000 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-scripts" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.186752 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl"] Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.276026 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-config-data\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.276082 4794 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpv7\" (UniqueName: \"kubernetes.io/projected/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-kube-api-access-2hpv7\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.276152 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.276241 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-scripts-volume\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.377394 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpv7\" (UniqueName: \"kubernetes.io/projected/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-kube-api-access-2hpv7\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.377495 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: 
\"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.377606 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-scripts-volume\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.377644 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-config-data\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.383080 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-scripts-volume\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.383379 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-config-data\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.386563 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-combined-ca-bundle\") 
pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.392276 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpv7\" (UniqueName: \"kubernetes.io/projected/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-kube-api-access-2hpv7\") pod \"watcher-kuttl-db-purge-29430146-jqxbl\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.453739 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.748656 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60bdf2da-411f-483d-a415-477200c358e2" path="/var/lib/kubelet/pods/60bdf2da-411f-483d-a415-477200c358e2/volumes" Dec 15 14:26:00 crc kubenswrapper[4794]: I1215 14:26:00.987006 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl"] Dec 15 14:26:00 crc kubenswrapper[4794]: W1215 14:26:00.999009 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec005cef_8fdc_46ce_b5a5_b0411c73f3fc.slice/crio-fa07105cc34228bcca656e3398efdbc0d4e986ba56e24e1a58b26623da930e4a WatchSource:0}: Error finding container fa07105cc34228bcca656e3398efdbc0d4e986ba56e24e1a58b26623da930e4a: Status 404 returned error can't find the container with id fa07105cc34228bcca656e3398efdbc0d4e986ba56e24e1a58b26623da930e4a Dec 15 14:26:01 crc kubenswrapper[4794]: I1215 14:26:01.145091 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" 
event={"ID":"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc","Type":"ContainerStarted","Data":"fa07105cc34228bcca656e3398efdbc0d4e986ba56e24e1a58b26623da930e4a"} Dec 15 14:26:02 crc kubenswrapper[4794]: I1215 14:26:02.153549 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" event={"ID":"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc","Type":"ContainerStarted","Data":"5bc0bf451dbb8e67f21c2d45ff97e849b96c3ac3bedd2b5bdebbcd6fd0d06181"} Dec 15 14:26:02 crc kubenswrapper[4794]: I1215 14:26:02.155633 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerStarted","Data":"5fa16a04017325a7c76ff81534dfa9b060a01d93f3b9402c2f67550056ae38ec"} Dec 15 14:26:03 crc kubenswrapper[4794]: E1215 14:26:03.501450 4794 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.151:47448->38.102.83.151:42789: read tcp 38.102.83.151:47448->38.102.83.151:42789: read: connection reset by peer Dec 15 14:26:04 crc kubenswrapper[4794]: I1215 14:26:04.174753 4794 generic.go:334] "Generic (PLEG): container finished" podID="ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" containerID="5bc0bf451dbb8e67f21c2d45ff97e849b96c3ac3bedd2b5bdebbcd6fd0d06181" exitCode=0 Dec 15 14:26:04 crc kubenswrapper[4794]: I1215 14:26:04.174824 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" event={"ID":"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc","Type":"ContainerDied","Data":"5bc0bf451dbb8e67f21c2d45ff97e849b96c3ac3bedd2b5bdebbcd6fd0d06181"} Dec 15 14:26:04 crc kubenswrapper[4794]: I1215 14:26:04.176747 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerStarted","Data":"8071aa6ef5b48185e1f50da4ba6f34a0fbff42c36283206c02b64c92340f8ed2"} Dec 15 14:26:05 crc 
kubenswrapper[4794]: I1215 14:26:05.555822 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.597697 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-scripts-volume\") pod \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.597996 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpv7\" (UniqueName: \"kubernetes.io/projected/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-kube-api-access-2hpv7\") pod \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.598170 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-config-data\") pod \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.598275 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-combined-ca-bundle\") pod \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\" (UID: \"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc\") " Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.603313 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-scripts-volume" (OuterVolumeSpecName: "scripts-volume") pod "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" (UID: "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc"). InnerVolumeSpecName "scripts-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.610478 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-kube-api-access-2hpv7" (OuterVolumeSpecName: "kube-api-access-2hpv7") pod "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" (UID: "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc"). InnerVolumeSpecName "kube-api-access-2hpv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.628533 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" (UID: "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.653258 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-config-data" (OuterVolumeSpecName: "config-data") pod "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" (UID: "ec005cef-8fdc-46ce-b5a5-b0411c73f3fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.700246 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.700307 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.700321 4794 reconciler_common.go:293] "Volume detached for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-scripts-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:05 crc kubenswrapper[4794]: I1215 14:26:05.700333 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpv7\" (UniqueName: \"kubernetes.io/projected/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc-kube-api-access-2hpv7\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:06 crc kubenswrapper[4794]: I1215 14:26:06.195452 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" event={"ID":"ec005cef-8fdc-46ce-b5a5-b0411c73f3fc","Type":"ContainerDied","Data":"fa07105cc34228bcca656e3398efdbc0d4e986ba56e24e1a58b26623da930e4a"} Dec 15 14:26:06 crc kubenswrapper[4794]: I1215 14:26:06.195674 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa07105cc34228bcca656e3398efdbc0d4e986ba56e24e1a58b26623da930e4a" Dec 15 14:26:06 crc kubenswrapper[4794]: I1215 14:26:06.195690 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl" Dec 15 14:26:06 crc kubenswrapper[4794]: I1215 14:26:06.198872 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerStarted","Data":"ef222b6613dc815435671964e0e5c8e13e81ab64a7ebfca5e243b371d2aeda35"} Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.616763 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.626516 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-7tf8r"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.634566 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.644907 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29430146-jqxbl"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.671280 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-2tdz9"] Dec 15 14:26:08 crc kubenswrapper[4794]: E1215 14:26:08.671633 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" containerName="watcher-db-manage" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.671648 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" containerName="watcher-db-manage" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.671838 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" containerName="watcher-db-manage" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.672403 4794 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.697605 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-2tdz9"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.723330 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.723533 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="052edc18-54e2-4ed4-8913-6a1288c0b12e" containerName="watcher-decision-engine" containerID="cri-o://38ecefd4cee8659ac403a64c32995829abebe7e6d1637176048a08a5ed2e4cef" gracePeriod=30 Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.746075 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2ac705-11fa-4ee0-a422-a695ff5cf5bf" path="/var/lib/kubelet/pods/2c2ac705-11fa-4ee0-a422-a695ff5cf5bf/volumes" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.746698 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec005cef-8fdc-46ce-b5a5-b0411c73f3fc" path="/var/lib/kubelet/pods/ec005cef-8fdc-46ce-b5a5-b0411c73f3fc/volumes" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.752715 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.771142 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-kuttl-api-log" containerID="cri-o://c4dda8c40d51bbf29e2f7410778e55955b4618b3737d50027ea25e8f58b49d7b" gracePeriod=30 Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.771273 4794 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-api" containerID="cri-o://83c648c206ea539fc89d990490f8e824b54447a8aa55baa4e667d59632ff11f0" gracePeriod=30 Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.785998 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.786271 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-kuttl-api-log" containerID="cri-o://5d45c714ff581fb6879ea02ad616cca0b2ad826bb90814c31be88025e79bd9d1" gracePeriod=30 Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.786431 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-api" containerID="cri-o://7611ababa19817f26008b86f91f3b122125c842ba489bc600b2cbc64153faf13" gracePeriod=30 Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.837639 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.837896 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="eb70af09-6df0-4675-ae4a-6b460676478e" containerName="watcher-applier" containerID="cri-o://e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" gracePeriod=30 Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.858939 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzrqc\" (UniqueName: \"kubernetes.io/projected/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959-kube-api-access-dzrqc\") pod 
\"watchertest-account-delete-2tdz9\" (UID: \"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959\") " pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.960081 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzrqc\" (UniqueName: \"kubernetes.io/projected/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959-kube-api-access-dzrqc\") pod \"watchertest-account-delete-2tdz9\" (UID: \"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959\") " pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.992405 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzrqc\" (UniqueName: \"kubernetes.io/projected/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959-kube-api-access-dzrqc\") pod \"watchertest-account-delete-2tdz9\" (UID: \"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959\") " pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:08 crc kubenswrapper[4794]: I1215 14:26:08.998994 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.233250 4794 generic.go:334] "Generic (PLEG): container finished" podID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerID="5d45c714ff581fb6879ea02ad616cca0b2ad826bb90814c31be88025e79bd9d1" exitCode=143 Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.233628 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"a2a4eea6-4594-41ff-ba05-f09d845ccbe5","Type":"ContainerDied","Data":"5d45c714ff581fb6879ea02ad616cca0b2ad826bb90814c31be88025e79bd9d1"} Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.252549 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerStarted","Data":"ea1e919c26ef01fc86168431607046932295055cb1f0538e9e0f1d2655b85ce1"} Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.253877 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.267895 4794 generic.go:334] "Generic (PLEG): container finished" podID="112e9d28-233f-460a-856a-04778ef398d0" containerID="c4dda8c40d51bbf29e2f7410778e55955b4618b3737d50027ea25e8f58b49d7b" exitCode=143 Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.267953 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"112e9d28-233f-460a-856a-04778ef398d0","Type":"ContainerDied","Data":"c4dda8c40d51bbf29e2f7410778e55955b4618b3737d50027ea25e8f58b49d7b"} Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.278335 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.097195678 podStartE2EDuration="10.278314183s" podCreationTimestamp="2025-12-15 
14:25:59 +0000 UTC" firstStartedPulling="2025-12-15 14:25:59.85424687 +0000 UTC m=+1921.706269308" lastFinishedPulling="2025-12-15 14:26:08.035365335 +0000 UTC m=+1929.887387813" observedRunningTime="2025-12-15 14:26:09.276166902 +0000 UTC m=+1931.128189360" watchObservedRunningTime="2025-12-15 14:26:09.278314183 +0000 UTC m=+1931.130336641" Dec 15 14:26:09 crc kubenswrapper[4794]: I1215 14:26:09.596393 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-2tdz9"] Dec 15 14:26:09 crc kubenswrapper[4794]: W1215 14:26:09.602744 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0b2a05f_5e36_4f26_b1df_f5ea3ecee959.slice/crio-c6779a63ef2802f7b67506368b09d4980fabdaca07b266a16648925d5b2206fa WatchSource:0}: Error finding container c6779a63ef2802f7b67506368b09d4980fabdaca07b266a16648925d5b2206fa: Status 404 returned error can't find the container with id c6779a63ef2802f7b67506368b09d4980fabdaca07b266a16648925d5b2206fa Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.278801 4794 generic.go:334] "Generic (PLEG): container finished" podID="e0b2a05f-5e36-4f26-b1df-f5ea3ecee959" containerID="035d3c93a9e36bf323cc782fbdcf42b562074b46ccac5e77a88906f3d1d84cb9" exitCode=0 Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.279090 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" event={"ID":"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959","Type":"ContainerDied","Data":"035d3c93a9e36bf323cc782fbdcf42b562074b46ccac5e77a88906f3d1d84cb9"} Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.279113 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" event={"ID":"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959","Type":"ContainerStarted","Data":"c6779a63ef2802f7b67506368b09d4980fabdaca07b266a16648925d5b2206fa"} Dec 15 14:26:10 crc 
kubenswrapper[4794]: I1215 14:26:10.284284 4794 generic.go:334] "Generic (PLEG): container finished" podID="112e9d28-233f-460a-856a-04778ef398d0" containerID="83c648c206ea539fc89d990490f8e824b54447a8aa55baa4e667d59632ff11f0" exitCode=0 Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.284346 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"112e9d28-233f-460a-856a-04778ef398d0","Type":"ContainerDied","Data":"83c648c206ea539fc89d990490f8e824b54447a8aa55baa4e667d59632ff11f0"} Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.287008 4794 generic.go:334] "Generic (PLEG): container finished" podID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerID="7611ababa19817f26008b86f91f3b122125c842ba489bc600b2cbc64153faf13" exitCode=0 Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.287054 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"a2a4eea6-4594-41ff-ba05-f09d845ccbe5","Type":"ContainerDied","Data":"7611ababa19817f26008b86f91f3b122125c842ba489bc600b2cbc64153faf13"} Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.288630 4794 generic.go:334] "Generic (PLEG): container finished" podID="052edc18-54e2-4ed4-8913-6a1288c0b12e" containerID="38ecefd4cee8659ac403a64c32995829abebe7e6d1637176048a08a5ed2e4cef" exitCode=0 Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.289549 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"052edc18-54e2-4ed4-8913-6a1288c0b12e","Type":"ContainerDied","Data":"38ecefd4cee8659ac403a64c32995829abebe7e6d1637176048a08a5ed2e4cef"} Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.289572 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"052edc18-54e2-4ed4-8913-6a1288c0b12e","Type":"ContainerDied","Data":"4b9ce2ba9e92202c2b521f70b6177b8e8cffd07fbd20eb58de1bd8e05bc7c1dc"} Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.289598 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b9ce2ba9e92202c2b521f70b6177b8e8cffd07fbd20eb58de1bd8e05bc7c1dc" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.307406 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.404899 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-custom-prometheus-ca\") pod \"052edc18-54e2-4ed4-8913-6a1288c0b12e\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.404953 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-combined-ca-bundle\") pod \"052edc18-54e2-4ed4-8913-6a1288c0b12e\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.474713 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "052edc18-54e2-4ed4-8913-6a1288c0b12e" (UID: "052edc18-54e2-4ed4-8913-6a1288c0b12e"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.475075 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "052edc18-54e2-4ed4-8913-6a1288c0b12e" (UID: "052edc18-54e2-4ed4-8913-6a1288c0b12e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.512259 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfpjc\" (UniqueName: \"kubernetes.io/projected/052edc18-54e2-4ed4-8913-6a1288c0b12e-kube-api-access-mfpjc\") pod \"052edc18-54e2-4ed4-8913-6a1288c0b12e\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.512306 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052edc18-54e2-4ed4-8913-6a1288c0b12e-logs\") pod \"052edc18-54e2-4ed4-8913-6a1288c0b12e\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.512349 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-config-data\") pod \"052edc18-54e2-4ed4-8913-6a1288c0b12e\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.512366 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-cert-memcached-mtls\") pod \"052edc18-54e2-4ed4-8913-6a1288c0b12e\" (UID: \"052edc18-54e2-4ed4-8913-6a1288c0b12e\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.512752 4794 
reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.512768 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.523626 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/052edc18-54e2-4ed4-8913-6a1288c0b12e-logs" (OuterVolumeSpecName: "logs") pod "052edc18-54e2-4ed4-8913-6a1288c0b12e" (UID: "052edc18-54e2-4ed4-8913-6a1288c0b12e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.535842 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052edc18-54e2-4ed4-8913-6a1288c0b12e-kube-api-access-mfpjc" (OuterVolumeSpecName: "kube-api-access-mfpjc") pod "052edc18-54e2-4ed4-8913-6a1288c0b12e" (UID: "052edc18-54e2-4ed4-8913-6a1288c0b12e"). InnerVolumeSpecName "kube-api-access-mfpjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.568932 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-config-data" (OuterVolumeSpecName: "config-data") pod "052edc18-54e2-4ed4-8913-6a1288c0b12e" (UID: "052edc18-54e2-4ed4-8913-6a1288c0b12e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.582603 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.614563 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfpjc\" (UniqueName: \"kubernetes.io/projected/052edc18-54e2-4ed4-8913-6a1288c0b12e-kube-api-access-mfpjc\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.614621 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/052edc18-54e2-4ed4-8913-6a1288c0b12e-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.614635 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.626027 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.641253 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "052edc18-54e2-4ed4-8913-6a1288c0b12e" (UID: "052edc18-54e2-4ed4-8913-6a1288c0b12e"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.715923 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-combined-ca-bundle\") pod \"112e9d28-233f-460a-856a-04778ef398d0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.715995 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112e9d28-233f-460a-856a-04778ef398d0-logs\") pod \"112e9d28-233f-460a-856a-04778ef398d0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716049 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-custom-prometheus-ca\") pod \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716099 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-cert-memcached-mtls\") pod \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716161 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-config-data\") pod \"112e9d28-233f-460a-856a-04778ef398d0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716258 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-config-data\") pod \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716286 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-combined-ca-bundle\") pod \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716315 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-custom-prometheus-ca\") pod \"112e9d28-233f-460a-856a-04778ef398d0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716335 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-cert-memcached-mtls\") pod \"112e9d28-233f-460a-856a-04778ef398d0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716361 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-962lf\" (UniqueName: \"kubernetes.io/projected/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-kube-api-access-962lf\") pod \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716401 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-logs\") pod \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\" (UID: \"a2a4eea6-4594-41ff-ba05-f09d845ccbe5\") " Dec 15 14:26:10 crc kubenswrapper[4794]: 
I1215 14:26:10.716439 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr87l\" (UniqueName: \"kubernetes.io/projected/112e9d28-233f-460a-856a-04778ef398d0-kube-api-access-pr87l\") pod \"112e9d28-233f-460a-856a-04778ef398d0\" (UID: \"112e9d28-233f-460a-856a-04778ef398d0\") " Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716929 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/052edc18-54e2-4ed4-8913-6a1288c0b12e-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.716983 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112e9d28-233f-460a-856a-04778ef398d0-logs" (OuterVolumeSpecName: "logs") pod "112e9d28-233f-460a-856a-04778ef398d0" (UID: "112e9d28-233f-460a-856a-04778ef398d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.724055 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-logs" (OuterVolumeSpecName: "logs") pod "a2a4eea6-4594-41ff-ba05-f09d845ccbe5" (UID: "a2a4eea6-4594-41ff-ba05-f09d845ccbe5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.741704 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112e9d28-233f-460a-856a-04778ef398d0-kube-api-access-pr87l" (OuterVolumeSpecName: "kube-api-access-pr87l") pod "112e9d28-233f-460a-856a-04778ef398d0" (UID: "112e9d28-233f-460a-856a-04778ef398d0"). InnerVolumeSpecName "kube-api-access-pr87l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.743722 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-kube-api-access-962lf" (OuterVolumeSpecName: "kube-api-access-962lf") pod "a2a4eea6-4594-41ff-ba05-f09d845ccbe5" (UID: "a2a4eea6-4594-41ff-ba05-f09d845ccbe5"). InnerVolumeSpecName "kube-api-access-962lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.801926 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a2a4eea6-4594-41ff-ba05-f09d845ccbe5" (UID: "a2a4eea6-4594-41ff-ba05-f09d845ccbe5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: E1215 14:26:10.802055 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:26:10 crc kubenswrapper[4794]: E1215 14:26:10.810700 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.818242 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112e9d28-233f-460a-856a-04778ef398d0-logs\") on node \"crc\" 
DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.818278 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.818289 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-962lf\" (UniqueName: \"kubernetes.io/projected/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-kube-api-access-962lf\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.818299 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.818307 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr87l\" (UniqueName: \"kubernetes.io/projected/112e9d28-233f-460a-856a-04778ef398d0-kube-api-access-pr87l\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: E1215 14:26:10.828709 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 15 14:26:10 crc kubenswrapper[4794]: E1215 14:26:10.828791 4794 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="eb70af09-6df0-4675-ae4a-6b460676478e" containerName="watcher-applier" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.866658 4794 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2a4eea6-4594-41ff-ba05-f09d845ccbe5" (UID: "a2a4eea6-4594-41ff-ba05-f09d845ccbe5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.874743 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "112e9d28-233f-460a-856a-04778ef398d0" (UID: "112e9d28-233f-460a-856a-04778ef398d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.897782 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-config-data" (OuterVolumeSpecName: "config-data") pod "112e9d28-233f-460a-856a-04778ef398d0" (UID: "112e9d28-233f-460a-856a-04778ef398d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.914018 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "112e9d28-233f-460a-856a-04778ef398d0" (UID: "112e9d28-233f-460a-856a-04778ef398d0"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.919604 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.919637 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.919649 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.919660 4794 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.966318 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "112e9d28-233f-460a-856a-04778ef398d0" (UID: "112e9d28-233f-460a-856a-04778ef398d0"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:10 crc kubenswrapper[4794]: I1215 14:26:10.969824 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-config-data" (OuterVolumeSpecName: "config-data") pod "a2a4eea6-4594-41ff-ba05-f09d845ccbe5" (UID: "a2a4eea6-4594-41ff-ba05-f09d845ccbe5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.017710 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "a2a4eea6-4594-41ff-ba05-f09d845ccbe5" (UID: "a2a4eea6-4594-41ff-ba05-f09d845ccbe5"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.020909 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.020940 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/112e9d28-233f-460a-856a-04778ef398d0-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.020955 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/a2a4eea6-4594-41ff-ba05-f09d845ccbe5-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.298204 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.298202 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"a2a4eea6-4594-41ff-ba05-f09d845ccbe5","Type":"ContainerDied","Data":"fa69ed245baee4569c56dde38d96906f7950b35374b96d427c8ba975c7bae7d5"} Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.298373 4794 scope.go:117] "RemoveContainer" containerID="7611ababa19817f26008b86f91f3b122125c842ba489bc600b2cbc64153faf13" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.300531 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.301377 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.310959 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"112e9d28-233f-460a-856a-04778ef398d0","Type":"ContainerDied","Data":"86f191a9d33be2a0881743f6839f1ad5ecca836a813593b5ce6eb66375a5547a"} Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.324099 4794 scope.go:117] "RemoveContainer" containerID="5d45c714ff581fb6879ea02ad616cca0b2ad826bb90814c31be88025e79bd9d1" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.349080 4794 scope.go:117] "RemoveContainer" containerID="83c648c206ea539fc89d990490f8e824b54447a8aa55baa4e667d59632ff11f0" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.355852 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.362085 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Dec 15 14:26:11 crc 
kubenswrapper[4794]: I1215 14:26:11.370391 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.377652 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.384652 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.389718 4794 scope.go:117] "RemoveContainer" containerID="c4dda8c40d51bbf29e2f7410778e55955b4618b3737d50027ea25e8f58b49d7b" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.399202 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.632982 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.742004 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzrqc\" (UniqueName: \"kubernetes.io/projected/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959-kube-api-access-dzrqc\") pod \"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959\" (UID: \"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959\") " Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.754800 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959-kube-api-access-dzrqc" (OuterVolumeSpecName: "kube-api-access-dzrqc") pod "e0b2a05f-5e36-4f26-b1df-f5ea3ecee959" (UID: "e0b2a05f-5e36-4f26-b1df-f5ea3ecee959"). InnerVolumeSpecName "kube-api-access-dzrqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:11 crc kubenswrapper[4794]: I1215 14:26:11.845037 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzrqc\" (UniqueName: \"kubernetes.io/projected/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959-kube-api-access-dzrqc\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:12 crc kubenswrapper[4794]: I1215 14:26:12.319497 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" Dec 15 14:26:12 crc kubenswrapper[4794]: I1215 14:26:12.319481 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-2tdz9" event={"ID":"e0b2a05f-5e36-4f26-b1df-f5ea3ecee959","Type":"ContainerDied","Data":"c6779a63ef2802f7b67506368b09d4980fabdaca07b266a16648925d5b2206fa"} Dec 15 14:26:12 crc kubenswrapper[4794]: I1215 14:26:12.320650 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6779a63ef2802f7b67506368b09d4980fabdaca07b266a16648925d5b2206fa" Dec 15 14:26:12 crc kubenswrapper[4794]: I1215 14:26:12.747062 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052edc18-54e2-4ed4-8913-6a1288c0b12e" path="/var/lib/kubelet/pods/052edc18-54e2-4ed4-8913-6a1288c0b12e/volumes" Dec 15 14:26:12 crc kubenswrapper[4794]: I1215 14:26:12.748646 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112e9d28-233f-460a-856a-04778ef398d0" path="/var/lib/kubelet/pods/112e9d28-233f-460a-856a-04778ef398d0/volumes" Dec 15 14:26:12 crc kubenswrapper[4794]: I1215 14:26:12.749370 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" path="/var/lib/kubelet/pods/a2a4eea6-4594-41ff-ba05-f09d845ccbe5/volumes" Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.221925 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] 
Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.222269 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-central-agent" containerID="cri-o://5fa16a04017325a7c76ff81534dfa9b060a01d93f3b9402c2f67550056ae38ec" gracePeriod=30 Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.222324 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="sg-core" containerID="cri-o://ef222b6613dc815435671964e0e5c8e13e81ab64a7ebfca5e243b371d2aeda35" gracePeriod=30 Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.222381 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-notification-agent" containerID="cri-o://8071aa6ef5b48185e1f50da4ba6f34a0fbff42c36283206c02b64c92340f8ed2" gracePeriod=30 Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.222414 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="proxy-httpd" containerID="cri-o://ea1e919c26ef01fc86168431607046932295055cb1f0538e9e0f1d2655b85ce1" gracePeriod=30 Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.702344 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nxgrl"] Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.710735 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nxgrl"] Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.718875 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-2tdz9"] Dec 15 14:26:13 crc 
kubenswrapper[4794]: I1215 14:26:13.727668 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-gbb2p"] Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.734309 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-2tdz9"] Dec 15 14:26:13 crc kubenswrapper[4794]: I1215 14:26:13.740801 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-gbb2p"] Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.322866 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.338267 4794 generic.go:334] "Generic (PLEG): container finished" podID="eb70af09-6df0-4675-ae4a-6b460676478e" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" exitCode=0 Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.338324 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"eb70af09-6df0-4675-ae4a-6b460676478e","Type":"ContainerDied","Data":"e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a"} Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.338350 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"eb70af09-6df0-4675-ae4a-6b460676478e","Type":"ContainerDied","Data":"91cbc27823e9fe5869d8566a7833cbf5c383776e5fb3c264b7fc5935dd9abdc4"} Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.338366 4794 scope.go:117] "RemoveContainer" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.338473 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344211 4794 generic.go:334] "Generic (PLEG): container finished" podID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerID="ea1e919c26ef01fc86168431607046932295055cb1f0538e9e0f1d2655b85ce1" exitCode=0 Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344235 4794 generic.go:334] "Generic (PLEG): container finished" podID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerID="ef222b6613dc815435671964e0e5c8e13e81ab64a7ebfca5e243b371d2aeda35" exitCode=2 Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344244 4794 generic.go:334] "Generic (PLEG): container finished" podID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerID="8071aa6ef5b48185e1f50da4ba6f34a0fbff42c36283206c02b64c92340f8ed2" exitCode=0 Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344250 4794 generic.go:334] "Generic (PLEG): container finished" podID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerID="5fa16a04017325a7c76ff81534dfa9b060a01d93f3b9402c2f67550056ae38ec" exitCode=0 Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344253 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerDied","Data":"ea1e919c26ef01fc86168431607046932295055cb1f0538e9e0f1d2655b85ce1"} Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344300 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerDied","Data":"ef222b6613dc815435671964e0e5c8e13e81ab64a7ebfca5e243b371d2aeda35"} Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344313 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerDied","Data":"8071aa6ef5b48185e1f50da4ba6f34a0fbff42c36283206c02b64c92340f8ed2"} Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.344325 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerDied","Data":"5fa16a04017325a7c76ff81534dfa9b060a01d93f3b9402c2f67550056ae38ec"} Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.384371 4794 scope.go:117] "RemoveContainer" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" Dec 15 14:26:14 crc kubenswrapper[4794]: E1215 14:26:14.384955 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a\": container with ID starting with e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a not found: ID does not exist" containerID="e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.385014 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a"} err="failed to get container status \"e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a\": rpc error: code = NotFound desc = could not find container \"e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a\": container with ID starting with e50844daf714d803b6bf32bc89d4a0495a76c7a52987e8a52645c228a053102a not found: ID does not exist" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.392873 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-cert-memcached-mtls\") pod 
\"eb70af09-6df0-4675-ae4a-6b460676478e\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.392923 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-combined-ca-bundle\") pod \"eb70af09-6df0-4675-ae4a-6b460676478e\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.393017 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcfwc\" (UniqueName: \"kubernetes.io/projected/eb70af09-6df0-4675-ae4a-6b460676478e-kube-api-access-rcfwc\") pod \"eb70af09-6df0-4675-ae4a-6b460676478e\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.393090 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70af09-6df0-4675-ae4a-6b460676478e-logs\") pod \"eb70af09-6df0-4675-ae4a-6b460676478e\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.393110 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-config-data\") pod \"eb70af09-6df0-4675-ae4a-6b460676478e\" (UID: \"eb70af09-6df0-4675-ae4a-6b460676478e\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.393925 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb70af09-6df0-4675-ae4a-6b460676478e-logs" (OuterVolumeSpecName: "logs") pod "eb70af09-6df0-4675-ae4a-6b460676478e" (UID: "eb70af09-6df0-4675-ae4a-6b460676478e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.412793 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb70af09-6df0-4675-ae4a-6b460676478e-kube-api-access-rcfwc" (OuterVolumeSpecName: "kube-api-access-rcfwc") pod "eb70af09-6df0-4675-ae4a-6b460676478e" (UID: "eb70af09-6df0-4675-ae4a-6b460676478e"). InnerVolumeSpecName "kube-api-access-rcfwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.460818 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb70af09-6df0-4675-ae4a-6b460676478e" (UID: "eb70af09-6df0-4675-ae4a-6b460676478e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.495806 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "eb70af09-6df0-4675-ae4a-6b460676478e" (UID: "eb70af09-6df0-4675-ae4a-6b460676478e"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.507362 4794 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb70af09-6df0-4675-ae4a-6b460676478e-logs\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.507399 4794 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.507412 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.507426 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcfwc\" (UniqueName: \"kubernetes.io/projected/eb70af09-6df0-4675-ae4a-6b460676478e-kube-api-access-rcfwc\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.511069 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-config-data" (OuterVolumeSpecName: "config-data") pod "eb70af09-6df0-4675-ae4a-6b460676478e" (UID: "eb70af09-6df0-4675-ae4a-6b460676478e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.608368 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb70af09-6df0-4675-ae4a-6b460676478e-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.669335 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.680538 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.681774 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.749023 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fe66d29-318e-409e-8ff0-d5bfafa2d782" path="/var/lib/kubelet/pods/0fe66d29-318e-409e-8ff0-d5bfafa2d782/volumes" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.749888 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a4772e-0b90-4c8f-9f25-a8aa83ea34d7" path="/var/lib/kubelet/pods/43a4772e-0b90-4c8f-9f25-a8aa83ea34d7/volumes" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.750482 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b2a05f-5e36-4f26-b1df-f5ea3ecee959" path="/var/lib/kubelet/pods/e0b2a05f-5e36-4f26-b1df-f5ea3ecee959/volumes" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.751330 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb70af09-6df0-4675-ae4a-6b460676478e" path="/var/lib/kubelet/pods/eb70af09-6df0-4675-ae4a-6b460676478e/volumes" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.810541 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-log-httpd\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.810641 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2d7\" (UniqueName: \"kubernetes.io/projected/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-kube-api-access-kh2d7\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.810986 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811055 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-config-data\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811089 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-sg-core-conf-yaml\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811138 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-scripts\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: 
\"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811171 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-ceilometer-tls-certs\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811209 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-combined-ca-bundle\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811250 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-run-httpd\") pod \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\" (UID: \"59dc6449-7d86-451a-aa7f-0e1e1c8022ea\") " Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811886 4794 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.811982 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.815673 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-scripts" (OuterVolumeSpecName: "scripts") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.816921 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-kube-api-access-kh2d7" (OuterVolumeSpecName: "kube-api-access-kh2d7") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "kube-api-access-kh2d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.837124 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.859207 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.879939 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.899600 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-config-data" (OuterVolumeSpecName: "config-data") pod "59dc6449-7d86-451a-aa7f-0e1e1c8022ea" (UID: "59dc6449-7d86-451a-aa7f-0e1e1c8022ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.913724 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh2d7\" (UniqueName: \"kubernetes.io/projected/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-kube-api-access-kh2d7\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.914305 4794 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.914367 4794 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.914427 4794 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 15 
14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.914481 4794 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.914540 4794 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:14 crc kubenswrapper[4794]: I1215 14:26:14.914644 4794 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59dc6449-7d86-451a-aa7f-0e1e1c8022ea-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.374488 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"59dc6449-7d86-451a-aa7f-0e1e1c8022ea","Type":"ContainerDied","Data":"c164c07b86ecb939ba6211125a5c380a0023a81eabc19ca98d9be6b42725922b"} Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.374534 4794 scope.go:117] "RemoveContainer" containerID="ea1e919c26ef01fc86168431607046932295055cb1f0538e9e0f1d2655b85ce1" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.374623 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.428085 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.430401 4794 scope.go:117] "RemoveContainer" containerID="ef222b6613dc815435671964e0e5c8e13e81ab64a7ebfca5e243b371d2aeda35" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.445898 4794 scope.go:117] "RemoveContainer" containerID="8071aa6ef5b48185e1f50da4ba6f34a0fbff42c36283206c02b64c92340f8ed2" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.446439 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.472379 4794 scope.go:117] "RemoveContainer" containerID="5fa16a04017325a7c76ff81534dfa9b060a01d93f3b9402c2f67550056ae38ec" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.473457 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.473819 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b2a05f-5e36-4f26-b1df-f5ea3ecee959" containerName="mariadb-account-delete" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.473865 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b2a05f-5e36-4f26-b1df-f5ea3ecee959" containerName="mariadb-account-delete" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.473883 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb70af09-6df0-4675-ae4a-6b460676478e" containerName="watcher-applier" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.473891 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb70af09-6df0-4675-ae4a-6b460676478e" containerName="watcher-applier" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.473906 4794 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-api" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.473914 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-api" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.473924 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052edc18-54e2-4ed4-8913-6a1288c0b12e" containerName="watcher-decision-engine" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.473932 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="052edc18-54e2-4ed4-8913-6a1288c0b12e" containerName="watcher-decision-engine" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.480849 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-kuttl-api-log" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.480890 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-kuttl-api-log" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.480926 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="sg-core" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.480934 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="sg-core" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.480946 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-notification-agent" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.480953 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-notification-agent" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.480981 4794 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-central-agent" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.480988 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-central-agent" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.480997 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-api" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481005 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-api" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.481017 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="proxy-httpd" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481024 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="proxy-httpd" Dec 15 14:26:15 crc kubenswrapper[4794]: E1215 14:26:15.481038 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-kuttl-api-log" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481045 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-kuttl-api-log" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481352 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-api" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481370 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-api" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481383 4794 
memory_manager.go:354] "RemoveStaleState removing state" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="sg-core" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481404 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b2a05f-5e36-4f26-b1df-f5ea3ecee959" containerName="mariadb-account-delete" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481417 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-central-agent" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481430 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="112e9d28-233f-460a-856a-04778ef398d0" containerName="watcher-kuttl-api-log" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481441 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="proxy-httpd" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481453 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a4eea6-4594-41ff-ba05-f09d845ccbe5" containerName="watcher-kuttl-api-log" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481461 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb70af09-6df0-4675-ae4a-6b460676478e" containerName="watcher-applier" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481473 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="052edc18-54e2-4ed4-8913-6a1288c0b12e" containerName="watcher-decision-engine" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.481483 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" containerName="ceilometer-notification-agent" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.483090 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.486836 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.487180 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.487377 4794 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.507900 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.531939 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-config-data\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532008 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532082 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-scripts\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532105 4794 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532124 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532189 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnrb\" (UniqueName: \"kubernetes.io/projected/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-kube-api-access-7gnrb\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532222 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.532274 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640495 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-config-data\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640563 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640659 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-scripts\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640692 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640717 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640760 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnrb\" (UniqueName: \"kubernetes.io/projected/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-kube-api-access-7gnrb\") pod \"ceilometer-0\" (UID: 
\"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640795 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.640852 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.641436 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.641863 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.644436 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.645133 4794 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-config-data\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.645482 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.646977 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.653075 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-scripts\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.671534 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnrb\" (UniqueName: \"kubernetes.io/projected/d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2-kube-api-access-7gnrb\") pod \"ceilometer-0\" (UID: \"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2\") " pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:15 crc kubenswrapper[4794]: I1215 14:26:15.808359 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:16 crc kubenswrapper[4794]: I1215 14:26:16.255762 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Dec 15 14:26:16 crc kubenswrapper[4794]: I1215 14:26:16.384221 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2","Type":"ContainerStarted","Data":"c9776174142e8efadce0762e05f5b3a029340d90e4017ff06f41cf07c0662c15"} Dec 15 14:26:16 crc kubenswrapper[4794]: I1215 14:26:16.747709 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59dc6449-7d86-451a-aa7f-0e1e1c8022ea" path="/var/lib/kubelet/pods/59dc6449-7d86-451a-aa7f-0e1e1c8022ea/volumes" Dec 15 14:26:17 crc kubenswrapper[4794]: I1215 14:26:17.396479 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2","Type":"ContainerStarted","Data":"aefa6c29feb1b009bb7022cdc21b44620d471d5383e81d047d49c73b65b3a6ac"} Dec 15 14:26:18 crc kubenswrapper[4794]: I1215 14:26:18.408497 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2","Type":"ContainerStarted","Data":"aaa8b29a168b458a2ac8fcbd73c23ce75d8629b1e6f08ec7724f92e7c5188491"} Dec 15 14:26:19 crc kubenswrapper[4794]: I1215 14:26:19.420133 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2","Type":"ContainerStarted","Data":"df2a8b721a9b5809645a90ddf3a36928cb93ac963d1d55fe36d26103de9614fa"} Dec 15 14:26:21 crc kubenswrapper[4794]: I1215 14:26:21.439459 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2","Type":"ContainerStarted","Data":"1764bb72d573d531ec741e68fdbc330d9c6862e674e8a3b352381cb819559ba9"} Dec 15 14:26:21 crc kubenswrapper[4794]: I1215 14:26:21.440251 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:21 crc kubenswrapper[4794]: I1215 14:26:21.469103 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.37225551 podStartE2EDuration="6.469079494s" podCreationTimestamp="2025-12-15 14:26:15 +0000 UTC" firstStartedPulling="2025-12-15 14:26:16.267924245 +0000 UTC m=+1938.119946693" lastFinishedPulling="2025-12-15 14:26:20.364748219 +0000 UTC m=+1942.216770677" observedRunningTime="2025-12-15 14:26:21.462892739 +0000 UTC m=+1943.314915197" watchObservedRunningTime="2025-12-15 14:26:21.469079494 +0000 UTC m=+1943.321101932" Dec 15 14:26:28 crc kubenswrapper[4794]: I1215 14:26:28.025059 4794 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="60bdf2da-411f-483d-a415-477200c358e2" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.217:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.079098 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-695j6/must-gather-sh5td"] Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.081064 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.082807 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-695j6"/"default-dockercfg-d7thn" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.082997 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-695j6"/"kube-root-ca.crt" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.093147 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-695j6/must-gather-sh5td"] Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.097724 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-695j6"/"openshift-service-ca.crt" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.228225 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/656e9896-c532-43f3-bfa5-5009102e1cfa-must-gather-output\") pod \"must-gather-sh5td\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.228544 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjf2s\" (UniqueName: \"kubernetes.io/projected/656e9896-c532-43f3-bfa5-5009102e1cfa-kube-api-access-sjf2s\") pod \"must-gather-sh5td\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.330034 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjf2s\" (UniqueName: \"kubernetes.io/projected/656e9896-c532-43f3-bfa5-5009102e1cfa-kube-api-access-sjf2s\") pod \"must-gather-sh5td\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " 
pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.330129 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/656e9896-c532-43f3-bfa5-5009102e1cfa-must-gather-output\") pod \"must-gather-sh5td\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.330727 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/656e9896-c532-43f3-bfa5-5009102e1cfa-must-gather-output\") pod \"must-gather-sh5td\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.352232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjf2s\" (UniqueName: \"kubernetes.io/projected/656e9896-c532-43f3-bfa5-5009102e1cfa-kube-api-access-sjf2s\") pod \"must-gather-sh5td\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.397954 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:26:38 crc kubenswrapper[4794]: W1215 14:26:38.837678 4794 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod656e9896_c532_43f3_bfa5_5009102e1cfa.slice/crio-e6c1bb0ad81c0e47404022cc545a42ed24551545e9e2f58140de4f89deba1342 WatchSource:0}: Error finding container e6c1bb0ad81c0e47404022cc545a42ed24551545e9e2f58140de4f89deba1342: Status 404 returned error can't find the container with id e6c1bb0ad81c0e47404022cc545a42ed24551545e9e2f58140de4f89deba1342 Dec 15 14:26:38 crc kubenswrapper[4794]: I1215 14:26:38.851876 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-695j6/must-gather-sh5td"] Dec 15 14:26:39 crc kubenswrapper[4794]: I1215 14:26:39.598503 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-695j6/must-gather-sh5td" event={"ID":"656e9896-c532-43f3-bfa5-5009102e1cfa","Type":"ContainerStarted","Data":"e6c1bb0ad81c0e47404022cc545a42ed24551545e9e2f58140de4f89deba1342"} Dec 15 14:26:45 crc kubenswrapper[4794]: I1215 14:26:45.815478 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Dec 15 14:26:49 crc kubenswrapper[4794]: I1215 14:26:49.695634 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-695j6/must-gather-sh5td" event={"ID":"656e9896-c532-43f3-bfa5-5009102e1cfa","Type":"ContainerStarted","Data":"059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44"} Dec 15 14:26:49 crc kubenswrapper[4794]: I1215 14:26:49.696092 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-695j6/must-gather-sh5td" event={"ID":"656e9896-c532-43f3-bfa5-5009102e1cfa","Type":"ContainerStarted","Data":"61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674"} Dec 15 14:26:49 crc kubenswrapper[4794]: I1215 14:26:49.718135 
4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-695j6/must-gather-sh5td" podStartSLOduration=1.937879214 podStartE2EDuration="11.718109973s" podCreationTimestamp="2025-12-15 14:26:38 +0000 UTC" firstStartedPulling="2025-12-15 14:26:38.840166196 +0000 UTC m=+1960.692188634" lastFinishedPulling="2025-12-15 14:26:48.620396955 +0000 UTC m=+1970.472419393" observedRunningTime="2025-12-15 14:26:49.713748949 +0000 UTC m=+1971.565771407" watchObservedRunningTime="2025-12-15 14:26:49.718109973 +0000 UTC m=+1971.570132411" Dec 15 14:27:08 crc kubenswrapper[4794]: I1215 14:27:08.197400 4794 scope.go:117] "RemoveContainer" containerID="522f6336f89b5be0536f8116bd501ff2b25b37104204f5d8a1aaac54241e7f4f" Dec 15 14:27:24 crc kubenswrapper[4794]: I1215 14:27:24.534573 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:27:24 crc kubenswrapper[4794]: I1215 14:27:24.535137 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:27:45 crc kubenswrapper[4794]: I1215 14:27:45.871995 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sq2vb"] Dec 15 14:27:45 crc kubenswrapper[4794]: I1215 14:27:45.875852 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:45 crc kubenswrapper[4794]: I1215 14:27:45.885946 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sq2vb"] Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.026861 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-utilities\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.027327 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-catalog-content\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.027361 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5rn\" (UniqueName: \"kubernetes.io/projected/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-kube-api-access-tb5rn\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.129119 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-catalog-content\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.129175 4794 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tb5rn\" (UniqueName: \"kubernetes.io/projected/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-kube-api-access-tb5rn\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.129260 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-utilities\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.129810 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-utilities\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.129806 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-catalog-content\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.150232 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb5rn\" (UniqueName: \"kubernetes.io/projected/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-kube-api-access-tb5rn\") pod \"redhat-operators-sq2vb\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.245391 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:46 crc kubenswrapper[4794]: I1215 14:27:46.705323 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sq2vb"] Dec 15 14:27:47 crc kubenswrapper[4794]: I1215 14:27:47.213135 4794 generic.go:334] "Generic (PLEG): container finished" podID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerID="9625062190f72b286dd1698d0621d56656121310ba425a12af42cc73dc14aeca" exitCode=0 Dec 15 14:27:47 crc kubenswrapper[4794]: I1215 14:27:47.213240 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerDied","Data":"9625062190f72b286dd1698d0621d56656121310ba425a12af42cc73dc14aeca"} Dec 15 14:27:47 crc kubenswrapper[4794]: I1215 14:27:47.213398 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerStarted","Data":"ad2f76aceb9aaf095967f8880979da57fdc4793cafd02cb469b711a32788407c"} Dec 15 14:27:47 crc kubenswrapper[4794]: I1215 14:27:47.214946 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 14:27:48 crc kubenswrapper[4794]: I1215 14:27:48.221881 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerStarted","Data":"53b4960b0332dd3ef703a1c8ac4faaadca5f5c52a11d81c1aa46333a16169ee3"} Dec 15 14:27:49 crc kubenswrapper[4794]: I1215 14:27:49.235912 4794 generic.go:334] "Generic (PLEG): container finished" podID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerID="53b4960b0332dd3ef703a1c8ac4faaadca5f5c52a11d81c1aa46333a16169ee3" exitCode=0 Dec 15 14:27:49 crc kubenswrapper[4794]: I1215 14:27:49.235949 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerDied","Data":"53b4960b0332dd3ef703a1c8ac4faaadca5f5c52a11d81c1aa46333a16169ee3"} Dec 15 14:27:50 crc kubenswrapper[4794]: I1215 14:27:50.245651 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerStarted","Data":"ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198"} Dec 15 14:27:50 crc kubenswrapper[4794]: I1215 14:27:50.279345 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sq2vb" podStartSLOduration=2.709109303 podStartE2EDuration="5.279320614s" podCreationTimestamp="2025-12-15 14:27:45 +0000 UTC" firstStartedPulling="2025-12-15 14:27:47.214672824 +0000 UTC m=+2029.066695262" lastFinishedPulling="2025-12-15 14:27:49.784884135 +0000 UTC m=+2031.636906573" observedRunningTime="2025-12-15 14:27:50.272837811 +0000 UTC m=+2032.124860269" watchObservedRunningTime="2025-12-15 14:27:50.279320614 +0000 UTC m=+2032.131343082" Dec 15 14:27:53 crc kubenswrapper[4794]: I1215 14:27:53.157481 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/util/0.log" Dec 15 14:27:53 crc kubenswrapper[4794]: I1215 14:27:53.157499 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/util/0.log" Dec 15 14:27:53 crc kubenswrapper[4794]: I1215 14:27:53.158546 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/pull/0.log" Dec 15 14:27:53 crc kubenswrapper[4794]: I1215 14:27:53.688030 
4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/pull/0.log" Dec 15 14:27:53 crc kubenswrapper[4794]: I1215 14:27:53.732982 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/pull/0.log" Dec 15 14:27:53 crc kubenswrapper[4794]: I1215 14:27:53.854294 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/extract/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.042002 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_71cbbb4fcc98efbe5d7e254604034ae8e7ec0c88ddb1c1ba1eb25d6c0d8wb8w_86b0d3e1-9b95-41f8-8fa9-27ae9efb3512/util/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.119606 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/util/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.154272 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/pull/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.169472 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/util/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.282097 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/pull/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.479524 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/util/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.509179 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/extract/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.534368 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.534440 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.765965 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7efa6f8b31df912e9df1bd03de07bf34fef6a0eee2eb034e97cb6bebcd9sxgz_91c52bd8-479c-4fee-bd2c-f46432f75395/pull/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.827381 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-bb565c8dd-qt4th_070970b1-bb19-4aa4-b544-241064874029/manager/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.832523 
4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-bb565c8dd-qt4th_070970b1-bb19-4aa4-b544-241064874029/kube-rbac-proxy/0.log" Dec 15 14:27:54 crc kubenswrapper[4794]: I1215 14:27:54.975251 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-669b58f65-c97jv_9f9e1543-36d5-427f-b1cc-3eb9baa9d826/kube-rbac-proxy/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.058407 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-669b58f65-c97jv_9f9e1543-36d5-427f-b1cc-3eb9baa9d826/manager/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.156339 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-69977bdf55-4tl5z_02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262/kube-rbac-proxy/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.211674 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-69977bdf55-4tl5z_02e2d0b6-2d72-4aa5-9a71-2c0fb9fcf262/manager/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.313321 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5847f67c56-vg9n8_88f32422-f5bd-4fd8-85d1-ff2d6ccc1633/kube-rbac-proxy/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.454964 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5847f67c56-vg9n8_88f32422-f5bd-4fd8-85d1-ff2d6ccc1633/manager/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.520457 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7b45cd6d68-dsxzg_488f9339-bc6f-419a-acdc-c4601f5f0d04/manager/0.log" Dec 15 14:27:55 crc 
kubenswrapper[4794]: I1215 14:27:55.583707 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7b45cd6d68-dsxzg_488f9339-bc6f-419a-acdc-c4601f5f0d04/kube-rbac-proxy/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.685976 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6985cf78fb-zpwbb_e573b9b7-1b6c-40d6-93e0-c9103105034d/kube-rbac-proxy/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.750544 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6985cf78fb-zpwbb_e573b9b7-1b6c-40d6-93e0-c9103105034d/manager/0.log" Dec 15 14:27:55 crc kubenswrapper[4794]: I1215 14:27:55.863446 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-85d55b5858-r7v8b_91f28ab0-ea37-4fef-87f2-4150127c276e/kube-rbac-proxy/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.001354 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54fd9dc4b5-bqsrj_c52870d2-d447-44d4-b68c-420d695b65a0/kube-rbac-proxy/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.051129 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54fd9dc4b5-bqsrj_c52870d2-d447-44d4-b68c-420d695b65a0/manager/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.204981 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f764db9b-kxkjz_9d2c6c2d-8ff4-416a-9cc6-447d855fd954/kube-rbac-proxy/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.246313 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:56 crc 
kubenswrapper[4794]: I1215 14:27:56.246930 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.308021 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.424276 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cc599445b-p7sck_d94816b6-2a4c-44fa-a7c0-811c18ec190d/kube-rbac-proxy/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.455259 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cc599445b-p7sck_d94816b6-2a4c-44fa-a7c0-811c18ec190d/manager/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.755397 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-nq78b_c7e6e262-54be-48b7-8e26-098358cab436/kube-rbac-proxy/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.781258 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-nq78b_c7e6e262-54be-48b7-8e26-098358cab436/manager/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.898741 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-58879495c-l2dll_d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd/kube-rbac-proxy/0.log" Dec 15 14:27:56 crc kubenswrapper[4794]: I1215 14:27:56.926824 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-58879495c-l2dll_d3e0c4d4-3eed-4e28-b8a6-4ee1a998cdcd/manager/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.067037 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6b444986fd-fjddz_84d790b6-ef4b-449b-80eb-0bc812ed496f/kube-rbac-proxy/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.103524 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6b444986fd-fjddz_84d790b6-ef4b-449b-80eb-0bc812ed496f/manager/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.248503 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-d5fb87cb8-269nm_1a552228-3637-48fa-b860-64f1d63d9726/kube-rbac-proxy/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.320689 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-d5fb87cb8-269nm_1a552228-3637-48fa-b860-64f1d63d9726/manager/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.360220 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.489324 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh_0dca5b13-a635-4513-988f-48091076cff9/kube-rbac-proxy/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.521450 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-85d55b5858-r7v8b_91f28ab0-ea37-4fef-87f2-4150127c276e/manager/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.523286 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f764db9b-kxkjz_9d2c6c2d-8ff4-416a-9cc6-447d855fd954/manager/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.563961 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74bb7fc7ffgzvnh_0dca5b13-a635-4513-988f-48091076cff9/manager/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.699885 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-cfcd4798-sk7w5_5c097f1d-1a33-480b-945a-5aa6c4e605c3/kube-rbac-proxy/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.812979 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nqbjm_da21a213-5a5f-4ce3-a3a4-c0579e46a726/registry-server/0.log" Dec 15 14:27:57 crc kubenswrapper[4794]: I1215 14:27:57.985921 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-6bmz8_16115cae-adf6-4065-90c9-082ab050dc96/manager/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.091079 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-6bmz8_16115cae-adf6-4065-90c9-082ab050dc96/kube-rbac-proxy/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.131951 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-cc776f956-tmnhc_20583712-205a-4875-aef3-0052b1dc4382/kube-rbac-proxy/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.263350 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-cc776f956-tmnhc_20583712-205a-4875-aef3-0052b1dc4382/manager/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.272216 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-cfcd4798-sk7w5_5c097f1d-1a33-480b-945a-5aa6c4e605c3/manager/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.326474 4794 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-wrkfx_5c962533-9c6b-459d-92b7-768ca6a8b110/operator/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.416035 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7c9ff8845d-tgfnx_b6e33671-c04d-4fc0-825d-13355e317733/kube-rbac-proxy/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.456564 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7c9ff8845d-tgfnx_b6e33671-c04d-4fc0-825d-13355e317733/manager/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.577750 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6bc5b9c47-drrxw_5e547ae0-d6e7-4dd7-b6c1-731554f36f8d/kube-rbac-proxy/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.720253 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5d79c6465c-mp2hm_d36dacd3-9670-4927-a93f-cbc50b901ef5/kube-rbac-proxy/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.774823 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6bc5b9c47-drrxw_5e547ae0-d6e7-4dd7-b6c1-731554f36f8d/manager/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.793876 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5d79c6465c-mp2hm_d36dacd3-9670-4927-a93f-cbc50b901ef5/manager/0.log" Dec 15 14:27:58 crc kubenswrapper[4794]: I1215 14:27:58.870425 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cd5749bb8-b2v8k_92dbb135-aa8d-4392-b6b5-53bdfd6d1c40/kube-rbac-proxy/0.log" Dec 15 14:27:59 crc 
kubenswrapper[4794]: I1215 14:27:59.007251 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-index-sr5v2_5bd930c9-ead7-4313-9f2f-ef3df0d06af2/registry-server/0.log" Dec 15 14:27:59 crc kubenswrapper[4794]: I1215 14:27:59.175545 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cd5749bb8-b2v8k_92dbb135-aa8d-4392-b6b5-53bdfd6d1c40/manager/0.log" Dec 15 14:28:05 crc kubenswrapper[4794]: I1215 14:28:05.856441 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sq2vb"] Dec 15 14:28:05 crc kubenswrapper[4794]: I1215 14:28:05.857262 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sq2vb" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="registry-server" containerID="cri-o://ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198" gracePeriod=2 Dec 15 14:28:06 crc kubenswrapper[4794]: E1215 14:28:06.246332 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198 is running failed: container process not found" containerID="ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198" cmd=["grpc_health_probe","-addr=:50051"] Dec 15 14:28:06 crc kubenswrapper[4794]: E1215 14:28:06.247121 4794 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198 is running failed: container process not found" containerID="ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198" cmd=["grpc_health_probe","-addr=:50051"] Dec 15 14:28:06 crc kubenswrapper[4794]: E1215 14:28:06.247711 4794 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198 is running failed: container process not found" containerID="ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198" cmd=["grpc_health_probe","-addr=:50051"] Dec 15 14:28:06 crc kubenswrapper[4794]: E1215 14:28:06.247739 4794 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-sq2vb" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="registry-server" Dec 15 14:28:06 crc kubenswrapper[4794]: I1215 14:28:06.383089 4794 generic.go:334] "Generic (PLEG): container finished" podID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerID="ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198" exitCode=0 Dec 15 14:28:06 crc kubenswrapper[4794]: I1215 14:28:06.383136 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerDied","Data":"ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198"} Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.343781 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.392240 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2vb" event={"ID":"ecebfd8d-157a-4a2d-8f8d-902c6a37483e","Type":"ContainerDied","Data":"ad2f76aceb9aaf095967f8880979da57fdc4793cafd02cb469b711a32788407c"} Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.392629 4794 scope.go:117] "RemoveContainer" containerID="ec08a9685292d7253021cfe42adfc17abe2d8c2c9352d05e1e9a08627f1a7198" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.392759 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2vb" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.410654 4794 scope.go:117] "RemoveContainer" containerID="53b4960b0332dd3ef703a1c8ac4faaadca5f5c52a11d81c1aa46333a16169ee3" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.454258 4794 scope.go:117] "RemoveContainer" containerID="9625062190f72b286dd1698d0621d56656121310ba425a12af42cc73dc14aeca" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.464258 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-utilities\") pod \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.464322 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-catalog-content\") pod \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.464514 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb5rn\" 
(UniqueName: \"kubernetes.io/projected/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-kube-api-access-tb5rn\") pod \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\" (UID: \"ecebfd8d-157a-4a2d-8f8d-902c6a37483e\") " Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.465034 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-utilities" (OuterVolumeSpecName: "utilities") pod "ecebfd8d-157a-4a2d-8f8d-902c6a37483e" (UID: "ecebfd8d-157a-4a2d-8f8d-902c6a37483e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.465349 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.482797 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-kube-api-access-tb5rn" (OuterVolumeSpecName: "kube-api-access-tb5rn") pod "ecebfd8d-157a-4a2d-8f8d-902c6a37483e" (UID: "ecebfd8d-157a-4a2d-8f8d-902c6a37483e"). InnerVolumeSpecName "kube-api-access-tb5rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.566374 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb5rn\" (UniqueName: \"kubernetes.io/projected/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-kube-api-access-tb5rn\") on node \"crc\" DevicePath \"\"" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.583851 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecebfd8d-157a-4a2d-8f8d-902c6a37483e" (UID: "ecebfd8d-157a-4a2d-8f8d-902c6a37483e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.667331 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecebfd8d-157a-4a2d-8f8d-902c6a37483e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.721121 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sq2vb"] Dec 15 14:28:07 crc kubenswrapper[4794]: I1215 14:28:07.727570 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sq2vb"] Dec 15 14:28:08 crc kubenswrapper[4794]: I1215 14:28:08.307672 4794 scope.go:117] "RemoveContainer" containerID="42a7538d19bb3a907b923d9c0f44f31ca268bcbc2a10a3ba0698e991113b53c5" Dec 15 14:28:08 crc kubenswrapper[4794]: I1215 14:28:08.330846 4794 scope.go:117] "RemoveContainer" containerID="8907a4744e92f2d4e3d7dbad46eb06efede8489b96338d6a9059fdbfd4d25658" Dec 15 14:28:08 crc kubenswrapper[4794]: I1215 14:28:08.373976 4794 scope.go:117] "RemoveContainer" containerID="cf96fafd5d3cba47b78d81de48266458c9434a141df497ba5d4d793783b51ff6" Dec 15 14:28:08 crc kubenswrapper[4794]: I1215 14:28:08.748221 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" path="/var/lib/kubelet/pods/ecebfd8d-157a-4a2d-8f8d-902c6a37483e/volumes" Dec 15 14:28:15 crc kubenswrapper[4794]: I1215 14:28:15.821989 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lvgzh_15605bda-e3da-46db-8c0d-9ebfadab8bbb/control-plane-machine-set-operator/0.log" Dec 15 14:28:15 crc kubenswrapper[4794]: I1215 14:28:15.990181 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qmz6h_0d34d344-13c1-4816-8286-2104852b248b/kube-rbac-proxy/0.log" Dec 15 14:28:16 crc kubenswrapper[4794]: I1215 14:28:16.046567 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qmz6h_0d34d344-13c1-4816-8286-2104852b248b/machine-api-operator/0.log" Dec 15 14:28:24 crc kubenswrapper[4794]: I1215 14:28:24.534629 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:28:24 crc kubenswrapper[4794]: I1215 14:28:24.535204 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:28:24 crc kubenswrapper[4794]: I1215 14:28:24.535259 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:28:24 crc kubenswrapper[4794]: I1215 14:28:24.535951 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37758683c163a83b27c0dcff30a182ab662ba60635fa8dc89b1899be6fa8e6c5"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:28:24 crc kubenswrapper[4794]: I1215 14:28:24.535996 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" 
podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://37758683c163a83b27c0dcff30a182ab662ba60635fa8dc89b1899be6fa8e6c5" gracePeriod=600 Dec 15 14:28:25 crc kubenswrapper[4794]: I1215 14:28:25.557525 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="37758683c163a83b27c0dcff30a182ab662ba60635fa8dc89b1899be6fa8e6c5" exitCode=0 Dec 15 14:28:25 crc kubenswrapper[4794]: I1215 14:28:25.557602 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"37758683c163a83b27c0dcff30a182ab662ba60635fa8dc89b1899be6fa8e6c5"} Dec 15 14:28:25 crc kubenswrapper[4794]: I1215 14:28:25.558213 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerStarted","Data":"da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"} Dec 15 14:28:25 crc kubenswrapper[4794]: I1215 14:28:25.558245 4794 scope.go:117] "RemoveContainer" containerID="231beddaf036401850c97b3c88363744ce528ec8fcffbf541f1c8defc8378098" Dec 15 14:28:29 crc kubenswrapper[4794]: I1215 14:28:29.118501 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-d9dgk_e6eea6f6-edca-4fad-9a5a-b5af09663e17/cert-manager-controller/0.log" Dec 15 14:28:29 crc kubenswrapper[4794]: I1215 14:28:29.339917 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-sgfqb_52533317-8c98-49ac-b92c-3b0684586408/cert-manager-cainjector/0.log" Dec 15 14:28:29 crc kubenswrapper[4794]: I1215 14:28:29.349908 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-b67fp_c99d997c-3d17-4555-b2c9-65f58c71088b/cert-manager-webhook/0.log" Dec 15 14:28:42 crc kubenswrapper[4794]: I1215 14:28:42.397248 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-m6bgr_b9245ea6-901c-4e5a-a3be-d7184ace8e8c/nmstate-console-plugin/0.log" Dec 15 14:28:42 crc kubenswrapper[4794]: I1215 14:28:42.606818 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-p8r8k_645d79e0-7c57-4dee-8065-beffdff79fa2/nmstate-handler/0.log" Dec 15 14:28:42 crc kubenswrapper[4794]: I1215 14:28:42.632965 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-q2rhd_1ae0e754-4486-4c14-b87d-df2cfc6a94dd/nmstate-metrics/0.log" Dec 15 14:28:42 crc kubenswrapper[4794]: I1215 14:28:42.648418 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-q2rhd_1ae0e754-4486-4c14-b87d-df2cfc6a94dd/kube-rbac-proxy/0.log" Dec 15 14:28:42 crc kubenswrapper[4794]: I1215 14:28:42.825065 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-7mjgk_ed156fbf-3a73-46b0-9f8c-6f233151f987/nmstate-operator/0.log" Dec 15 14:28:42 crc kubenswrapper[4794]: I1215 14:28:42.856091 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-cblrd_32475a8d-44fa-4fd8-9b4a-e907db6bcd4e/nmstate-webhook/0.log" Dec 15 14:28:58 crc kubenswrapper[4794]: I1215 14:28:58.400440 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-jf8wk_cb3f9a4c-a711-401e-a7fe-d4cec39be7d9/kube-rbac-proxy/0.log" Dec 15 14:28:58 crc kubenswrapper[4794]: I1215 14:28:58.550428 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5bddd4b946-jf8wk_cb3f9a4c-a711-401e-a7fe-d4cec39be7d9/controller/0.log" Dec 15 14:28:58 crc kubenswrapper[4794]: I1215 14:28:58.895274 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-frr-files/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.146010 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-frr-files/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.195066 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-metrics/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.205470 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-reloader/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.215056 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-reloader/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.365064 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-frr-files/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.379379 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-reloader/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.426523 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-metrics/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.433494 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-metrics/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.575875 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-frr-files/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.616605 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-metrics/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.631312 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/controller/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.639993 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/cp-reloader/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.821808 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/frr-metrics/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.861841 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/kube-rbac-proxy/0.log" Dec 15 14:28:59 crc kubenswrapper[4794]: I1215 14:28:59.933096 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/kube-rbac-proxy-frr/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.044674 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/reloader/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.238702 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-jr8lq_5975edc0-914a-4df8-824e-c83bfe9e2f49/frr-k8s-webhook-server/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.411608 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68bfc664d8-gmqz9_66f313bf-1362-4b0b-b516-9e4be299fb48/manager/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.552639 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9d7768595-tdwxc_3b7d789c-382b-41c7-af8b-78625cab6ea7/webhook-server/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.627758 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6c7j6_ea296da9-d8b4-41a2-834e-119076ca46a8/frr/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.686065 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8t6l8_433f5422-e79c-46f8-bc6d-7b6dbdaf2462/kube-rbac-proxy/0.log" Dec 15 14:29:00 crc kubenswrapper[4794]: I1215 14:29:00.945012 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8t6l8_433f5422-e79c-46f8-bc6d-7b6dbdaf2462/speaker/0.log" Dec 15 14:29:08 crc kubenswrapper[4794]: I1215 14:29:08.491944 4794 scope.go:117] "RemoveContainer" containerID="15b09cb321accff988017d1184ac1bec2e53f62bc8230fd60751c9336e69de5d" Dec 15 14:29:08 crc kubenswrapper[4794]: I1215 14:29:08.514097 4794 scope.go:117] "RemoveContainer" containerID="8e61b9bb8b8adecb2f87a5474c5e72e346b00c4bb0374303539e2979244d2b2e" Dec 15 14:29:08 crc kubenswrapper[4794]: I1215 14:29:08.541397 4794 scope.go:117] "RemoveContainer" containerID="decf9389ca33def0123c70849313cab03cd9dfa86fef8a7007d487f220b7b413" Dec 15 14:29:08 crc kubenswrapper[4794]: I1215 14:29:08.572546 4794 scope.go:117] "RemoveContainer" containerID="d3449d632eeb3ea108bf9350932219ba67827e47161b8fb69e142ecb881cd9ab" Dec 15 14:29:08 
crc kubenswrapper[4794]: I1215 14:29:08.615701 4794 scope.go:117] "RemoveContainer" containerID="5b501d753f3d4c153a6bbf97d51a485c4d521a9c75ff9e1fd286362bf0ada23c" Dec 15 14:29:25 crc kubenswrapper[4794]: I1215 14:29:25.723604 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_689fabcc-a835-471a-9184-728f662139cd/init-config-reloader/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.069773 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_689fabcc-a835-471a-9184-728f662139cd/alertmanager/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.091498 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_689fabcc-a835-471a-9184-728f662139cd/init-config-reloader/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.128225 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_689fabcc-a835-471a-9184-728f662139cd/config-reloader/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.266735 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2/ceilometer-notification-agent/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.285063 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2/ceilometer-central-agent/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.336489 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2/proxy-httpd/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.360737 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_ceilometer-0_d9cde551-8ba3-4fdd-b53e-14bf27d7b7b2/sg-core/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.553209 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-757bfcdb76-mnt9m_9ae848ae-88e2-430f-97e4-f3a46d1c178c/keystone-api/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.601775 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-bootstrap-hmftj_b758c36f-801e-4cb6-a477-d6a81d9d04cd/keystone-bootstrap/0.log" Dec 15 14:29:26 crc kubenswrapper[4794]: I1215 14:29:26.776024 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_kube-state-metrics-0_5aa63f8b-acce-4400-8f15-8aafe966738f/kube-state-metrics/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.019123 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_cd2eff53-fa29-451f-ab58-ea3e9639bbea/mysql-bootstrap/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.218658 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_cd2eff53-fa29-451f-ab58-ea3e9639bbea/galera/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.260783 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_cd2eff53-fa29-451f-ab58-ea3e9639bbea/mysql-bootstrap/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.457647 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstackclient_5b0af393-3fce-4ed9-921e-9b56d16f4b02/openstackclient/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.556497 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_b356cb49-9fcf-4e1b-8a40-ed69d1418acb/init-config-reloader/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 
14:29:27.819908 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_b356cb49-9fcf-4e1b-8a40-ed69d1418acb/init-config-reloader/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.824891 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_b356cb49-9fcf-4e1b-8a40-ed69d1418acb/config-reloader/0.log" Dec 15 14:29:27 crc kubenswrapper[4794]: I1215 14:29:27.898301 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_b356cb49-9fcf-4e1b-8a40-ed69d1418acb/prometheus/0.log" Dec 15 14:29:28 crc kubenswrapper[4794]: I1215 14:29:28.043926 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_b356cb49-9fcf-4e1b-8a40-ed69d1418acb/thanos-sidecar/0.log" Dec 15 14:29:28 crc kubenswrapper[4794]: I1215 14:29:28.173911 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_2eef62de-2115-49c1-bb86-8526606f7a69/setup-container/0.log" Dec 15 14:29:28 crc kubenswrapper[4794]: I1215 14:29:28.446936 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_2eef62de-2115-49c1-bb86-8526606f7a69/rabbitmq/0.log" Dec 15 14:29:28 crc kubenswrapper[4794]: I1215 14:29:28.484025 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_2eef62de-2115-49c1-bb86-8526606f7a69/setup-container/0.log" Dec 15 14:29:28 crc kubenswrapper[4794]: I1215 14:29:28.670085 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_823e3288-4f23-430b-843e-50f2e4230b46/setup-container/0.log" Dec 15 14:29:28 crc kubenswrapper[4794]: I1215 14:29:28.936701 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_823e3288-4f23-430b-843e-50f2e4230b46/setup-container/0.log" Dec 15 14:29:29 crc kubenswrapper[4794]: I1215 14:29:29.013968 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_823e3288-4f23-430b-843e-50f2e4230b46/rabbitmq/0.log" Dec 15 14:29:30 crc kubenswrapper[4794]: I1215 14:29:30.048529 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-hmftj"] Dec 15 14:29:30 crc kubenswrapper[4794]: I1215 14:29:30.054344 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-hmftj"] Dec 15 14:29:30 crc kubenswrapper[4794]: I1215 14:29:30.750240 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b758c36f-801e-4cb6-a477-d6a81d9d04cd" path="/var/lib/kubelet/pods/b758c36f-801e-4cb6-a477-d6a81d9d04cd/volumes" Dec 15 14:29:36 crc kubenswrapper[4794]: I1215 14:29:36.410342 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_memcached-0_52de7ae7-7c3b-4931-9313-cc3ebbe1a4f4/memcached/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.309818 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/util/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.482483 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/util/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.576663 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/pull/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.601963 
4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/pull/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.834680 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/pull/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.841887 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/extract/0.log" Dec 15 14:29:47 crc kubenswrapper[4794]: I1215 14:29:47.868493 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931adjwdn_b00203fa-f8ca-49f8-9a8d-1908f8414ead/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.006068 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.241791 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.249174 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/pull/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.303557 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/pull/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.419648 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.470575 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/pull/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.502663 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4tc7dp_9913031c-bd75-4a6b-b917-338f5d8afbe4/extract/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.601130 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.807833 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/pull/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.817698 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.835337 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/pull/0.log" Dec 15 
14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.995828 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/util/0.log" Dec 15 14:29:48 crc kubenswrapper[4794]: I1215 14:29:48.997414 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/extract/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.022271 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rv578_736d163e-7de0-4138-8a7e-74db6b7d5efc/pull/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.194953 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/util/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.426791 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/util/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.448242 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/pull/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.469946 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/pull/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.597417 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/util/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.669566 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/pull/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.710820 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8t579d_81ef22db-d2c9-404e-ae6b-e8a9aa3ddf46/extract/0.log" Dec 15 14:29:49 crc kubenswrapper[4794]: I1215 14:29:49.848228 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/extract-utilities/0.log" Dec 15 14:29:50 crc kubenswrapper[4794]: I1215 14:29:50.309427 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/extract-content/0.log" Dec 15 14:29:50 crc kubenswrapper[4794]: I1215 14:29:50.374554 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/extract-utilities/0.log" Dec 15 14:29:50 crc kubenswrapper[4794]: I1215 14:29:50.380927 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/extract-content/0.log" Dec 15 14:29:50 crc kubenswrapper[4794]: I1215 14:29:50.599027 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/extract-utilities/0.log" Dec 15 14:29:50 crc kubenswrapper[4794]: I1215 14:29:50.599475 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/extract-content/0.log" Dec 15 14:29:50 crc kubenswrapper[4794]: I1215 14:29:50.832007 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/extract-utilities/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.079416 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cqqqj_007bbddd-2d30-424a-b855-a317f4b14c3d/registry-server/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.088283 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/extract-utilities/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.149483 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/extract-content/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.165989 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/extract-content/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.336189 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/extract-utilities/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.344851 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/extract-content/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.462333 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j9rlt_fb02c0bd-f248-4bad-b91c-ed3581cda0bb/marketplace-operator/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.653564 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6qjdr_69c600b2-aa35-4a90-b45c-d15d6b2650d3/registry-server/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.670473 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/extract-utilities/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.837140 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/extract-utilities/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.845861 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/extract-content/0.log" Dec 15 14:29:51 crc kubenswrapper[4794]: I1215 14:29:51.847296 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/extract-content/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.080822 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/extract-utilities/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.110656 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/extract-content/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.113558 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-q8dqr_5ccf70d9-cc87-4a5f-acd1-0f07a6054aaf/registry-server/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.149977 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/extract-utilities/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.298534 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/extract-content/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.312664 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/extract-content/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.334140 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/extract-utilities/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.487296 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/extract-content/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.543633 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/extract-utilities/0.log" Dec 15 14:29:52 crc kubenswrapper[4794]: I1215 14:29:52.828749 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hl98z_1058fe26-4e3d-428c-9fcc-079b0efc1a33/registry-server/0.log" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.142651 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2"] Dec 15 14:30:00 crc kubenswrapper[4794]: E1215 
14:30:00.144118 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="extract-content" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.144221 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="extract-content" Dec 15 14:30:00 crc kubenswrapper[4794]: E1215 14:30:00.144330 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="extract-utilities" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.144413 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="extract-utilities" Dec 15 14:30:00 crc kubenswrapper[4794]: E1215 14:30:00.144507 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="registry-server" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.144613 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="registry-server" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.144931 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecebfd8d-157a-4a2d-8f8d-902c6a37483e" containerName="registry-server" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.145842 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.147678 4794 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.149207 4794 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.153779 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2"] Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.336277 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-secret-volume\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.336334 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-config-volume\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.336399 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr6f9\" (UniqueName: \"kubernetes.io/projected/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-kube-api-access-nr6f9\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.437499 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-secret-volume\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.437568 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-config-volume\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.437653 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr6f9\" (UniqueName: \"kubernetes.io/projected/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-kube-api-access-nr6f9\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.438816 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-config-volume\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.443707 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-secret-volume\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.457284 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr6f9\" (UniqueName: \"kubernetes.io/projected/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-kube-api-access-nr6f9\") pod \"collect-profiles-29430150-qhcz2\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.474161 4794 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:00 crc kubenswrapper[4794]: I1215 14:30:00.898460 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2"] Dec 15 14:30:01 crc kubenswrapper[4794]: I1215 14:30:01.320052 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" event={"ID":"1b3307d8-4ffd-493a-9b6b-877718d4d5d4","Type":"ContainerStarted","Data":"387470cb2d5bfc196635ad878b4bd5cc4e9b3ca319b5aa62d456ac2542360874"} Dec 15 14:30:01 crc kubenswrapper[4794]: I1215 14:30:01.320463 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" event={"ID":"1b3307d8-4ffd-493a-9b6b-877718d4d5d4","Type":"ContainerStarted","Data":"287b6ebe1924f2d7b95edd59390cacbf01529b512e4010a2c3a48fe4c83bb2c6"} Dec 15 14:30:02 crc kubenswrapper[4794]: I1215 14:30:02.328453 4794 generic.go:334] "Generic (PLEG): container finished" podID="1b3307d8-4ffd-493a-9b6b-877718d4d5d4" 
containerID="387470cb2d5bfc196635ad878b4bd5cc4e9b3ca319b5aa62d456ac2542360874" exitCode=0 Dec 15 14:30:02 crc kubenswrapper[4794]: I1215 14:30:02.328522 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" event={"ID":"1b3307d8-4ffd-493a-9b6b-877718d4d5d4","Type":"ContainerDied","Data":"387470cb2d5bfc196635ad878b4bd5cc4e9b3ca319b5aa62d456ac2542360874"} Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.676297 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.684263 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr6f9\" (UniqueName: \"kubernetes.io/projected/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-kube-api-access-nr6f9\") pod \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.684353 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-secret-volume\") pod \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.684384 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-config-volume\") pod \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\" (UID: \"1b3307d8-4ffd-493a-9b6b-877718d4d5d4\") " Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.684979 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"1b3307d8-4ffd-493a-9b6b-877718d4d5d4" (UID: "1b3307d8-4ffd-493a-9b6b-877718d4d5d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.690873 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-kube-api-access-nr6f9" (OuterVolumeSpecName: "kube-api-access-nr6f9") pod "1b3307d8-4ffd-493a-9b6b-877718d4d5d4" (UID: "1b3307d8-4ffd-493a-9b6b-877718d4d5d4"). InnerVolumeSpecName "kube-api-access-nr6f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.712739 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1b3307d8-4ffd-493a-9b6b-877718d4d5d4" (UID: "1b3307d8-4ffd-493a-9b6b-877718d4d5d4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.786054 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr6f9\" (UniqueName: \"kubernetes.io/projected/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-kube-api-access-nr6f9\") on node \"crc\" DevicePath \"\"" Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.786091 4794 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:30:03 crc kubenswrapper[4794]: I1215 14:30:03.786102 4794 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3307d8-4ffd-493a-9b6b-877718d4d5d4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 15 14:30:04 crc kubenswrapper[4794]: I1215 14:30:04.345799 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" event={"ID":"1b3307d8-4ffd-493a-9b6b-877718d4d5d4","Type":"ContainerDied","Data":"287b6ebe1924f2d7b95edd59390cacbf01529b512e4010a2c3a48fe4c83bb2c6"} Dec 15 14:30:04 crc kubenswrapper[4794]: I1215 14:30:04.346134 4794 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287b6ebe1924f2d7b95edd59390cacbf01529b512e4010a2c3a48fe4c83bb2c6" Dec 15 14:30:04 crc kubenswrapper[4794]: I1215 14:30:04.345906 4794 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29430150-qhcz2" Dec 15 14:30:04 crc kubenswrapper[4794]: I1215 14:30:04.408545 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"] Dec 15 14:30:04 crc kubenswrapper[4794]: I1215 14:30:04.414893 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29430105-2hmzt"] Dec 15 14:30:04 crc kubenswrapper[4794]: I1215 14:30:04.746344 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9394b5-19ce-4482-b107-9339d9813a25" path="/var/lib/kubelet/pods/da9394b5-19ce-4482-b107-9339d9813a25/volumes" Dec 15 14:30:05 crc kubenswrapper[4794]: I1215 14:30:05.809403 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-d4bvp_661de35e-03be-4902-9273-ae4f7d165a16/prometheus-operator/0.log" Dec 15 14:30:06 crc kubenswrapper[4794]: I1215 14:30:06.086063 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d478fb5f7-qs8s8_097c120c-be56-4089-b752-36706c337bcf/prometheus-operator-admission-webhook/0.log" Dec 15 14:30:06 crc kubenswrapper[4794]: I1215 14:30:06.164730 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d478fb5f7-s27z9_01c0c788-8deb-456f-b515-255500832030/prometheus-operator-admission-webhook/0.log" Dec 15 14:30:06 crc kubenswrapper[4794]: I1215 14:30:06.319103 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-5cd75_f160af63-166e-45de-8a47-cf3fbda615ed/operator/0.log" Dec 15 14:30:06 crc kubenswrapper[4794]: I1215 14:30:06.333541 4794 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-qrj6q_7f2996dd-6dd7-4d33-a893-3e8f27b82ad0/observability-ui-dashboards/0.log" Dec 15 14:30:06 crc kubenswrapper[4794]: I1215 14:30:06.478612 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-qsqps_981129d7-9a49-4888-a19c-3c2924e854c8/perses-operator/0.log" Dec 15 14:30:08 crc kubenswrapper[4794]: I1215 14:30:08.755387 4794 scope.go:117] "RemoveContainer" containerID="122c37339c404c5df3288e8575a3bfb34a8ce77160701718e71e9a9d6f2177bc" Dec 15 14:30:08 crc kubenswrapper[4794]: I1215 14:30:08.775534 4794 scope.go:117] "RemoveContainer" containerID="2984520b934caa937c9ae0139a16877a7696a351a94545275d12cf034ff8ddd2" Dec 15 14:30:08 crc kubenswrapper[4794]: I1215 14:30:08.840104 4794 scope.go:117] "RemoveContainer" containerID="0458b18c5616718b1599c32062ac677c77b74357e975dc80212cb5a039a5fb84" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.502231 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9g9j"] Dec 15 14:30:11 crc kubenswrapper[4794]: E1215 14:30:11.503185 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3307d8-4ffd-493a-9b6b-877718d4d5d4" containerName="collect-profiles" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.503199 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3307d8-4ffd-493a-9b6b-877718d4d5d4" containerName="collect-profiles" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.503355 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3307d8-4ffd-493a-9b6b-877718d4d5d4" containerName="collect-profiles" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.507595 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.513989 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-utilities\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.514076 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz8j7\" (UniqueName: \"kubernetes.io/projected/477c77cf-b966-4b70-b86f-c135006c56e8-kube-api-access-mz8j7\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.514146 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-catalog-content\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.514687 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9g9j"] Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.615881 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-utilities\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.615969 4794 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mz8j7\" (UniqueName: \"kubernetes.io/projected/477c77cf-b966-4b70-b86f-c135006c56e8-kube-api-access-mz8j7\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.616041 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-catalog-content\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.616455 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-utilities\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.616612 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-catalog-content\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.638516 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8j7\" (UniqueName: \"kubernetes.io/projected/477c77cf-b966-4b70-b86f-c135006c56e8-kube-api-access-mz8j7\") pod \"redhat-marketplace-b9g9j\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:11 crc kubenswrapper[4794]: I1215 14:30:11.841233 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:12 crc kubenswrapper[4794]: I1215 14:30:12.117988 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9g9j"] Dec 15 14:30:12 crc kubenswrapper[4794]: I1215 14:30:12.408807 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerStarted","Data":"abec460b43a61c1e67dde439c8cca1d1a7c601640aa7056ec6970eb38d7f9155"} Dec 15 14:30:12 crc kubenswrapper[4794]: I1215 14:30:12.409067 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerStarted","Data":"c9fc0ac9752df2a3021fcdff3113c30fe4db21e37e0d92955d790ab06d92a738"} Dec 15 14:30:13 crc kubenswrapper[4794]: I1215 14:30:13.417111 4794 generic.go:334] "Generic (PLEG): container finished" podID="477c77cf-b966-4b70-b86f-c135006c56e8" containerID="abec460b43a61c1e67dde439c8cca1d1a7c601640aa7056ec6970eb38d7f9155" exitCode=0 Dec 15 14:30:13 crc kubenswrapper[4794]: I1215 14:30:13.417154 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerDied","Data":"abec460b43a61c1e67dde439c8cca1d1a7c601640aa7056ec6970eb38d7f9155"} Dec 15 14:30:14 crc kubenswrapper[4794]: I1215 14:30:14.427386 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerStarted","Data":"36644f7d36dab934125dbfc1f65355e6875421f8273e3404f9bcc9235a007124"} Dec 15 14:30:15 crc kubenswrapper[4794]: I1215 14:30:15.437511 4794 generic.go:334] "Generic (PLEG): container finished" podID="477c77cf-b966-4b70-b86f-c135006c56e8" 
containerID="36644f7d36dab934125dbfc1f65355e6875421f8273e3404f9bcc9235a007124" exitCode=0 Dec 15 14:30:15 crc kubenswrapper[4794]: I1215 14:30:15.437916 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerDied","Data":"36644f7d36dab934125dbfc1f65355e6875421f8273e3404f9bcc9235a007124"} Dec 15 14:30:16 crc kubenswrapper[4794]: I1215 14:30:16.450824 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerStarted","Data":"d2a328bbd5a09985847465f9cfe5b4ccd9d18b3bd3ad45ad188a4e239c94a17a"} Dec 15 14:30:16 crc kubenswrapper[4794]: I1215 14:30:16.473869 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9g9j" podStartSLOduration=2.990254018 podStartE2EDuration="5.473854089s" podCreationTimestamp="2025-12-15 14:30:11 +0000 UTC" firstStartedPulling="2025-12-15 14:30:13.419445216 +0000 UTC m=+2175.271467654" lastFinishedPulling="2025-12-15 14:30:15.903045287 +0000 UTC m=+2177.755067725" observedRunningTime="2025-12-15 14:30:16.471265155 +0000 UTC m=+2178.323287613" watchObservedRunningTime="2025-12-15 14:30:16.473854089 +0000 UTC m=+2178.325876527" Dec 15 14:30:21 crc kubenswrapper[4794]: I1215 14:30:21.842470 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:21 crc kubenswrapper[4794]: I1215 14:30:21.843099 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:21 crc kubenswrapper[4794]: I1215 14:30:21.914029 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:22 crc kubenswrapper[4794]: I1215 14:30:22.544016 
4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:24 crc kubenswrapper[4794]: I1215 14:30:24.534313 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:30:24 crc kubenswrapper[4794]: I1215 14:30:24.534373 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:30:25 crc kubenswrapper[4794]: I1215 14:30:25.491028 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9g9j"] Dec 15 14:30:25 crc kubenswrapper[4794]: I1215 14:30:25.491529 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9g9j" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="registry-server" containerID="cri-o://d2a328bbd5a09985847465f9cfe5b4ccd9d18b3bd3ad45ad188a4e239c94a17a" gracePeriod=2 Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.556066 4794 generic.go:334] "Generic (PLEG): container finished" podID="477c77cf-b966-4b70-b86f-c135006c56e8" containerID="d2a328bbd5a09985847465f9cfe5b4ccd9d18b3bd3ad45ad188a4e239c94a17a" exitCode=0 Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.556246 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerDied","Data":"d2a328bbd5a09985847465f9cfe5b4ccd9d18b3bd3ad45ad188a4e239c94a17a"} Dec 15 
14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.681269 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.797441 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-catalog-content\") pod \"477c77cf-b966-4b70-b86f-c135006c56e8\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.797768 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz8j7\" (UniqueName: \"kubernetes.io/projected/477c77cf-b966-4b70-b86f-c135006c56e8-kube-api-access-mz8j7\") pod \"477c77cf-b966-4b70-b86f-c135006c56e8\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.797858 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-utilities\") pod \"477c77cf-b966-4b70-b86f-c135006c56e8\" (UID: \"477c77cf-b966-4b70-b86f-c135006c56e8\") " Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.798740 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-utilities" (OuterVolumeSpecName: "utilities") pod "477c77cf-b966-4b70-b86f-c135006c56e8" (UID: "477c77cf-b966-4b70-b86f-c135006c56e8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.814000 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477c77cf-b966-4b70-b86f-c135006c56e8-kube-api-access-mz8j7" (OuterVolumeSpecName: "kube-api-access-mz8j7") pod "477c77cf-b966-4b70-b86f-c135006c56e8" (UID: "477c77cf-b966-4b70-b86f-c135006c56e8"). InnerVolumeSpecName "kube-api-access-mz8j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.818161 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "477c77cf-b966-4b70-b86f-c135006c56e8" (UID: "477c77cf-b966-4b70-b86f-c135006c56e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.900022 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz8j7\" (UniqueName: \"kubernetes.io/projected/477c77cf-b966-4b70-b86f-c135006c56e8-kube-api-access-mz8j7\") on node \"crc\" DevicePath \"\"" Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.900068 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:30:28 crc kubenswrapper[4794]: I1215 14:30:28.900081 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477c77cf-b966-4b70-b86f-c135006c56e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.565708 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9g9j" 
event={"ID":"477c77cf-b966-4b70-b86f-c135006c56e8","Type":"ContainerDied","Data":"c9fc0ac9752df2a3021fcdff3113c30fe4db21e37e0d92955d790ab06d92a738"} Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.566733 4794 scope.go:117] "RemoveContainer" containerID="d2a328bbd5a09985847465f9cfe5b4ccd9d18b3bd3ad45ad188a4e239c94a17a" Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.565761 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9g9j" Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.603894 4794 scope.go:117] "RemoveContainer" containerID="36644f7d36dab934125dbfc1f65355e6875421f8273e3404f9bcc9235a007124" Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.604589 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9g9j"] Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.611878 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9g9j"] Dec 15 14:30:29 crc kubenswrapper[4794]: I1215 14:30:29.639597 4794 scope.go:117] "RemoveContainer" containerID="abec460b43a61c1e67dde439c8cca1d1a7c601640aa7056ec6970eb38d7f9155" Dec 15 14:30:30 crc kubenswrapper[4794]: I1215 14:30:30.747404 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" path="/var/lib/kubelet/pods/477c77cf-b966-4b70-b86f-c135006c56e8/volumes" Dec 15 14:30:54 crc kubenswrapper[4794]: I1215 14:30:54.534117 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:30:54 crc kubenswrapper[4794]: I1215 14:30:54.534650 4794 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.701155 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9kdp"] Dec 15 14:31:05 crc kubenswrapper[4794]: E1215 14:31:05.702298 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="extract-utilities" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.702311 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="extract-utilities" Dec 15 14:31:05 crc kubenswrapper[4794]: E1215 14:31:05.702330 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="registry-server" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.702336 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="registry-server" Dec 15 14:31:05 crc kubenswrapper[4794]: E1215 14:31:05.702344 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="extract-content" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.702351 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="extract-content" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.702530 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="477c77cf-b966-4b70-b86f-c135006c56e8" containerName="registry-server" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.703844 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.710950 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9kdp"] Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.776150 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkf7\" (UniqueName: \"kubernetes.io/projected/0df6de8b-b299-4699-ae2b-485024547ee7-kube-api-access-mtkf7\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.776192 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-utilities\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.776267 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-catalog-content\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.877705 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkf7\" (UniqueName: \"kubernetes.io/projected/0df6de8b-b299-4699-ae2b-485024547ee7-kube-api-access-mtkf7\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.877747 4794 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-utilities\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.877793 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-catalog-content\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.878220 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-utilities\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.878325 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-catalog-content\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:05 crc kubenswrapper[4794]: I1215 14:31:05.913834 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkf7\" (UniqueName: \"kubernetes.io/projected/0df6de8b-b299-4699-ae2b-485024547ee7-kube-api-access-mtkf7\") pod \"community-operators-r9kdp\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:06 crc kubenswrapper[4794]: I1215 14:31:06.023035 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:06 crc kubenswrapper[4794]: I1215 14:31:06.584152 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9kdp"] Dec 15 14:31:06 crc kubenswrapper[4794]: I1215 14:31:06.888244 4794 generic.go:334] "Generic (PLEG): container finished" podID="0df6de8b-b299-4699-ae2b-485024547ee7" containerID="037babc52f9e9c8bbf66a7bb970aa0b041ffc3d9c162b37725aa52a767221cde" exitCode=0 Dec 15 14:31:06 crc kubenswrapper[4794]: I1215 14:31:06.888353 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9kdp" event={"ID":"0df6de8b-b299-4699-ae2b-485024547ee7","Type":"ContainerDied","Data":"037babc52f9e9c8bbf66a7bb970aa0b041ffc3d9c162b37725aa52a767221cde"} Dec 15 14:31:06 crc kubenswrapper[4794]: I1215 14:31:06.888505 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9kdp" event={"ID":"0df6de8b-b299-4699-ae2b-485024547ee7","Type":"ContainerStarted","Data":"d5d595e28a7ac5c71ad4157d3b827e0798cdcb8e345c913a9b29827c74fd5cf4"} Dec 15 14:31:08 crc kubenswrapper[4794]: I1215 14:31:08.964362 4794 scope.go:117] "RemoveContainer" containerID="57eefbc6dd361dfdd7b6f564940323c6d04f3cb3ba5549ff8205f272b7d8a6d1" Dec 15 14:31:08 crc kubenswrapper[4794]: I1215 14:31:08.984493 4794 scope.go:117] "RemoveContainer" containerID="099e44b5f21d9d55d2f7b9319227c38c45c0cdf0f4f68cadafd245465d28aabc" Dec 15 14:31:09 crc kubenswrapper[4794]: I1215 14:31:09.043971 4794 scope.go:117] "RemoveContainer" containerID="42359eed6935d78072ff38ffc73692efba5b34f01f228cb069eec87ebef740ce" Dec 15 14:31:09 crc kubenswrapper[4794]: I1215 14:31:09.076546 4794 scope.go:117] "RemoveContainer" containerID="0ea076a88d3f6e8e404686548685e5731f96cfdb2a6322e9aa999ab38c93ed7d" Dec 15 14:31:11 crc kubenswrapper[4794]: I1215 14:31:11.936767 4794 generic.go:334] "Generic (PLEG): container finished" 
podID="0df6de8b-b299-4699-ae2b-485024547ee7" containerID="8471bab609d3729266fdc883f2017d59be35cd3b27cc173c135ba7b098ff2165" exitCode=0 Dec 15 14:31:11 crc kubenswrapper[4794]: I1215 14:31:11.936870 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9kdp" event={"ID":"0df6de8b-b299-4699-ae2b-485024547ee7","Type":"ContainerDied","Data":"8471bab609d3729266fdc883f2017d59be35cd3b27cc173c135ba7b098ff2165"} Dec 15 14:31:11 crc kubenswrapper[4794]: I1215 14:31:11.939148 4794 generic.go:334] "Generic (PLEG): container finished" podID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerID="61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674" exitCode=0 Dec 15 14:31:11 crc kubenswrapper[4794]: I1215 14:31:11.939183 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-695j6/must-gather-sh5td" event={"ID":"656e9896-c532-43f3-bfa5-5009102e1cfa","Type":"ContainerDied","Data":"61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674"} Dec 15 14:31:11 crc kubenswrapper[4794]: I1215 14:31:11.939716 4794 scope.go:117] "RemoveContainer" containerID="61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674" Dec 15 14:31:12 crc kubenswrapper[4794]: I1215 14:31:12.256946 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-695j6_must-gather-sh5td_656e9896-c532-43f3-bfa5-5009102e1cfa/gather/0.log" Dec 15 14:31:13 crc kubenswrapper[4794]: I1215 14:31:13.969933 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9kdp" event={"ID":"0df6de8b-b299-4699-ae2b-485024547ee7","Type":"ContainerStarted","Data":"2a07ca014cf53b4f0eeb9080d1a03ae9b9729cca8de11a898331927b947ec60c"} Dec 15 14:31:13 crc kubenswrapper[4794]: I1215 14:31:13.991017 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9kdp" podStartSLOduration=3.040012336 
podStartE2EDuration="8.991000512s" podCreationTimestamp="2025-12-15 14:31:05 +0000 UTC" firstStartedPulling="2025-12-15 14:31:06.890557601 +0000 UTC m=+2228.742580059" lastFinishedPulling="2025-12-15 14:31:12.841545797 +0000 UTC m=+2234.693568235" observedRunningTime="2025-12-15 14:31:13.986125084 +0000 UTC m=+2235.838147522" watchObservedRunningTime="2025-12-15 14:31:13.991000512 +0000 UTC m=+2235.843022950" Dec 15 14:31:16 crc kubenswrapper[4794]: I1215 14:31:16.023623 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:16 crc kubenswrapper[4794]: I1215 14:31:16.023919 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:16 crc kubenswrapper[4794]: I1215 14:31:16.078021 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.479657 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-695j6/must-gather-sh5td"] Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.480140 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-695j6/must-gather-sh5td" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="copy" containerID="cri-o://059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44" gracePeriod=2 Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.485595 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-695j6/must-gather-sh5td"] Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.894006 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-695j6_must-gather-sh5td_656e9896-c532-43f3-bfa5-5009102e1cfa/copy/0.log" Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.894349 4794 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.979159 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjf2s\" (UniqueName: \"kubernetes.io/projected/656e9896-c532-43f3-bfa5-5009102e1cfa-kube-api-access-sjf2s\") pod \"656e9896-c532-43f3-bfa5-5009102e1cfa\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.979598 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/656e9896-c532-43f3-bfa5-5009102e1cfa-must-gather-output\") pod \"656e9896-c532-43f3-bfa5-5009102e1cfa\" (UID: \"656e9896-c532-43f3-bfa5-5009102e1cfa\") " Dec 15 14:31:19 crc kubenswrapper[4794]: I1215 14:31:19.984214 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/656e9896-c532-43f3-bfa5-5009102e1cfa-kube-api-access-sjf2s" (OuterVolumeSpecName: "kube-api-access-sjf2s") pod "656e9896-c532-43f3-bfa5-5009102e1cfa" (UID: "656e9896-c532-43f3-bfa5-5009102e1cfa"). InnerVolumeSpecName "kube-api-access-sjf2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.025803 4794 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-695j6_must-gather-sh5td_656e9896-c532-43f3-bfa5-5009102e1cfa/copy/0.log" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.026352 4794 generic.go:334] "Generic (PLEG): container finished" podID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerID="059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44" exitCode=143 Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.026477 4794 scope.go:117] "RemoveContainer" containerID="059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.026654 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-695j6/must-gather-sh5td" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.061388 4794 scope.go:117] "RemoveContainer" containerID="61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.083027 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjf2s\" (UniqueName: \"kubernetes.io/projected/656e9896-c532-43f3-bfa5-5009102e1cfa-kube-api-access-sjf2s\") on node \"crc\" DevicePath \"\"" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.107152 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/656e9896-c532-43f3-bfa5-5009102e1cfa-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "656e9896-c532-43f3-bfa5-5009102e1cfa" (UID: "656e9896-c532-43f3-bfa5-5009102e1cfa"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.139903 4794 scope.go:117] "RemoveContainer" containerID="059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44" Dec 15 14:31:20 crc kubenswrapper[4794]: E1215 14:31:20.140440 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44\": container with ID starting with 059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44 not found: ID does not exist" containerID="059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.140497 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44"} err="failed to get container status \"059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44\": rpc error: code = NotFound desc = could not find container \"059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44\": container with ID starting with 059d4e8780d263fb4377d8e8c73d32d3f500ad35f63023cb9b5d81107cc21a44 not found: ID does not exist" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.140529 4794 scope.go:117] "RemoveContainer" containerID="61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674" Dec 15 14:31:20 crc kubenswrapper[4794]: E1215 14:31:20.141554 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674\": container with ID starting with 61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674 not found: ID does not exist" containerID="61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.141619 
4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674"} err="failed to get container status \"61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674\": rpc error: code = NotFound desc = could not find container \"61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674\": container with ID starting with 61d8b1d6ed17b70a10af9ec4229e910b250c163a6dd9442fbadf15a84e67a674 not found: ID does not exist" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.184118 4794 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/656e9896-c532-43f3-bfa5-5009102e1cfa-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 15 14:31:20 crc kubenswrapper[4794]: I1215 14:31:20.749047 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" path="/var/lib/kubelet/pods/656e9896-c532-43f3-bfa5-5009102e1cfa/volumes" Dec 15 14:31:24 crc kubenswrapper[4794]: I1215 14:31:24.534673 4794 patch_prober.go:28] interesting pod/machine-config-daemon-fq2s6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 15 14:31:24 crc kubenswrapper[4794]: I1215 14:31:24.535211 4794 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 15 14:31:24 crc kubenswrapper[4794]: I1215 14:31:24.535251 4794 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" Dec 15 14:31:24 crc kubenswrapper[4794]: I1215 14:31:24.535825 4794 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"} pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 15 14:31:24 crc kubenswrapper[4794]: I1215 14:31:24.535867 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerName="machine-config-daemon" containerID="cri-o://da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" gracePeriod=600 Dec 15 14:31:25 crc kubenswrapper[4794]: E1215 14:31:25.164863 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:31:26 crc kubenswrapper[4794]: I1215 14:31:26.075782 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:26 crc kubenswrapper[4794]: I1215 14:31:26.084126 4794 generic.go:334] "Generic (PLEG): container finished" podID="3538082f-5d54-4676-a488-7a3df6b9a1f4" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" exitCode=0 Dec 15 14:31:26 crc kubenswrapper[4794]: I1215 14:31:26.084181 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" event={"ID":"3538082f-5d54-4676-a488-7a3df6b9a1f4","Type":"ContainerDied","Data":"da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"} Dec 15 14:31:26 crc kubenswrapper[4794]: I1215 14:31:26.084220 4794 scope.go:117] "RemoveContainer" containerID="37758683c163a83b27c0dcff30a182ab662ba60635fa8dc89b1899be6fa8e6c5" Dec 15 14:31:26 crc kubenswrapper[4794]: I1215 14:31:26.084762 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:31:26 crc kubenswrapper[4794]: E1215 14:31:26.085190 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:31:29 crc kubenswrapper[4794]: I1215 14:31:29.696011 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9kdp"] Dec 15 14:31:29 crc kubenswrapper[4794]: I1215 14:31:29.696752 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r9kdp" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="registry-server" containerID="cri-o://2a07ca014cf53b4f0eeb9080d1a03ae9b9729cca8de11a898331927b947ec60c" gracePeriod=2 Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.122940 4794 generic.go:334] "Generic (PLEG): container finished" podID="0df6de8b-b299-4699-ae2b-485024547ee7" containerID="2a07ca014cf53b4f0eeb9080d1a03ae9b9729cca8de11a898331927b947ec60c" exitCode=0 Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.123009 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-r9kdp" event={"ID":"0df6de8b-b299-4699-ae2b-485024547ee7","Type":"ContainerDied","Data":"2a07ca014cf53b4f0eeb9080d1a03ae9b9729cca8de11a898331927b947ec60c"} Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.217859 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.256459 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-catalog-content\") pod \"0df6de8b-b299-4699-ae2b-485024547ee7\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.256501 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtkf7\" (UniqueName: \"kubernetes.io/projected/0df6de8b-b299-4699-ae2b-485024547ee7-kube-api-access-mtkf7\") pod \"0df6de8b-b299-4699-ae2b-485024547ee7\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.256664 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-utilities\") pod \"0df6de8b-b299-4699-ae2b-485024547ee7\" (UID: \"0df6de8b-b299-4699-ae2b-485024547ee7\") " Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.257617 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-utilities" (OuterVolumeSpecName: "utilities") pod "0df6de8b-b299-4699-ae2b-485024547ee7" (UID: "0df6de8b-b299-4699-ae2b-485024547ee7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.262026 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df6de8b-b299-4699-ae2b-485024547ee7-kube-api-access-mtkf7" (OuterVolumeSpecName: "kube-api-access-mtkf7") pod "0df6de8b-b299-4699-ae2b-485024547ee7" (UID: "0df6de8b-b299-4699-ae2b-485024547ee7"). InnerVolumeSpecName "kube-api-access-mtkf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.313617 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0df6de8b-b299-4699-ae2b-485024547ee7" (UID: "0df6de8b-b299-4699-ae2b-485024547ee7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.358635 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.358701 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtkf7\" (UniqueName: \"kubernetes.io/projected/0df6de8b-b299-4699-ae2b-485024547ee7-kube-api-access-mtkf7\") on node \"crc\" DevicePath \"\"" Dec 15 14:31:30 crc kubenswrapper[4794]: I1215 14:31:30.358719 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df6de8b-b299-4699-ae2b-485024547ee7-utilities\") on node \"crc\" DevicePath \"\"" Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.132159 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9kdp" 
event={"ID":"0df6de8b-b299-4699-ae2b-485024547ee7","Type":"ContainerDied","Data":"d5d595e28a7ac5c71ad4157d3b827e0798cdcb8e345c913a9b29827c74fd5cf4"} Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.132479 4794 scope.go:117] "RemoveContainer" containerID="2a07ca014cf53b4f0eeb9080d1a03ae9b9729cca8de11a898331927b947ec60c" Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.132201 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9kdp" Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.153010 4794 scope.go:117] "RemoveContainer" containerID="8471bab609d3729266fdc883f2017d59be35cd3b27cc173c135ba7b098ff2165" Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.165804 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9kdp"] Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.174253 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r9kdp"] Dec 15 14:31:31 crc kubenswrapper[4794]: I1215 14:31:31.174565 4794 scope.go:117] "RemoveContainer" containerID="037babc52f9e9c8bbf66a7bb970aa0b041ffc3d9c162b37725aa52a767221cde" Dec 15 14:31:32 crc kubenswrapper[4794]: I1215 14:31:32.749246 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" path="/var/lib/kubelet/pods/0df6de8b-b299-4699-ae2b-485024547ee7/volumes" Dec 15 14:31:40 crc kubenswrapper[4794]: I1215 14:31:40.736831 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:31:40 crc kubenswrapper[4794]: E1215 14:31:40.737597 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:31:51 crc kubenswrapper[4794]: I1215 14:31:51.736904 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:31:51 crc kubenswrapper[4794]: E1215 14:31:51.737539 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:32:02 crc kubenswrapper[4794]: I1215 14:32:02.737955 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:32:02 crc kubenswrapper[4794]: E1215 14:32:02.738750 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.188327 4794 scope.go:117] "RemoveContainer" containerID="38ecefd4cee8659ac403a64c32995829abebe7e6d1637176048a08a5ed2e4cef" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.209832 4794 scope.go:117] "RemoveContainer" containerID="e478c7d48a88d7b58af836cdc28e6acb736f51f3717ed204157220df429d16cc" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.225647 4794 
scope.go:117] "RemoveContainer" containerID="9678a2af744f197f5c759a672f274ca8c7cdf9b80a380d1068fd064d98e1b73e" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.264719 4794 scope.go:117] "RemoveContainer" containerID="0e2eee288141a28506b3f3ab29dbac9c8fcdc1c9eaa37d7f6105744c5ecaacd2" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.283225 4794 scope.go:117] "RemoveContainer" containerID="355aa55b93259d90dc3a07f7ca12198e64bc515f59c676928c2fbd482b87c604" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.305124 4794 scope.go:117] "RemoveContainer" containerID="a1bee7fda2eebabc5ef6d0cd694d718e9a66482f94298892df5ca0bdcd76d1b2" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.330940 4794 scope.go:117] "RemoveContainer" containerID="5bc0bf451dbb8e67f21c2d45ff97e849b96c3ac3bedd2b5bdebbcd6fd0d06181" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.374266 4794 scope.go:117] "RemoveContainer" containerID="50d80cfb70b0a1253db0b3343804d25e4c7547107837898eae340a1f3ca98baf" Dec 15 14:32:09 crc kubenswrapper[4794]: I1215 14:32:09.390319 4794 scope.go:117] "RemoveContainer" containerID="65847b0b92bda8651c4e95e0630bf0fc8e348ad0ffb19f3759c637bcfd746711" Dec 15 14:32:17 crc kubenswrapper[4794]: I1215 14:32:17.737004 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:32:17 crc kubenswrapper[4794]: E1215 14:32:17.737816 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:32:32 crc kubenswrapper[4794]: I1215 14:32:32.737300 4794 scope.go:117] "RemoveContainer" 
containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:32:32 crc kubenswrapper[4794]: E1215 14:32:32.738919 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:32:43 crc kubenswrapper[4794]: I1215 14:32:43.737486 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:32:43 crc kubenswrapper[4794]: E1215 14:32:43.738326 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:32:57 crc kubenswrapper[4794]: I1215 14:32:57.737363 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:32:57 crc kubenswrapper[4794]: E1215 14:32:57.738124 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:33:09 crc kubenswrapper[4794]: I1215 14:33:09.535143 4794 scope.go:117] 
"RemoveContainer" containerID="035d3c93a9e36bf323cc782fbdcf42b562074b46ccac5e77a88906f3d1d84cb9" Dec 15 14:33:09 crc kubenswrapper[4794]: I1215 14:33:09.738412 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:33:09 crc kubenswrapper[4794]: E1215 14:33:09.738667 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.508187 4794 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v4fxt"] Dec 15 14:33:11 crc kubenswrapper[4794]: E1215 14:33:11.509646 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="gather" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.509728 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="gather" Dec 15 14:33:11 crc kubenswrapper[4794]: E1215 14:33:11.509809 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="extract-content" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.509930 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="extract-content" Dec 15 14:33:11 crc kubenswrapper[4794]: E1215 14:33:11.510077 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="registry-server" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.510166 4794 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="registry-server" Dec 15 14:33:11 crc kubenswrapper[4794]: E1215 14:33:11.510243 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="copy" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.510301 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="copy" Dec 15 14:33:11 crc kubenswrapper[4794]: E1215 14:33:11.510364 4794 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="extract-utilities" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.510424 4794 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="extract-utilities" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.510644 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="copy" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.510753 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df6de8b-b299-4699-ae2b-485024547ee7" containerName="registry-server" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.510819 4794 memory_manager.go:354] "RemoveStaleState removing state" podUID="656e9896-c532-43f3-bfa5-5009102e1cfa" containerName="gather" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.512085 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.530526 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4fxt"] Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.673636 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gttqf\" (UniqueName: \"kubernetes.io/projected/05f9d15a-7997-4c3f-9e98-654d653304c8-kube-api-access-gttqf\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.673720 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-catalog-content\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.673761 4794 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-utilities\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.775444 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-utilities\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.775541 4794 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gttqf\" (UniqueName: \"kubernetes.io/projected/05f9d15a-7997-4c3f-9e98-654d653304c8-kube-api-access-gttqf\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.775605 4794 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-catalog-content\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.776057 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-utilities\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.776359 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-catalog-content\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.795886 4794 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gttqf\" (UniqueName: \"kubernetes.io/projected/05f9d15a-7997-4c3f-9e98-654d653304c8-kube-api-access-gttqf\") pod \"certified-operators-v4fxt\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:11 crc kubenswrapper[4794]: I1215 14:33:11.837106 4794 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:12 crc kubenswrapper[4794]: I1215 14:33:12.437522 4794 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4fxt"] Dec 15 14:33:13 crc kubenswrapper[4794]: I1215 14:33:13.094437 4794 generic.go:334] "Generic (PLEG): container finished" podID="05f9d15a-7997-4c3f-9e98-654d653304c8" containerID="2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e" exitCode=0 Dec 15 14:33:13 crc kubenswrapper[4794]: I1215 14:33:13.094499 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4fxt" event={"ID":"05f9d15a-7997-4c3f-9e98-654d653304c8","Type":"ContainerDied","Data":"2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e"} Dec 15 14:33:13 crc kubenswrapper[4794]: I1215 14:33:13.094892 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4fxt" event={"ID":"05f9d15a-7997-4c3f-9e98-654d653304c8","Type":"ContainerStarted","Data":"78c5d5b09cac93525cccafd16a48fb67ea8aa2b6abc3e4bbca42d9d1af3b9d92"} Dec 15 14:33:13 crc kubenswrapper[4794]: I1215 14:33:13.097824 4794 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 15 14:33:15 crc kubenswrapper[4794]: I1215 14:33:15.109923 4794 generic.go:334] "Generic (PLEG): container finished" podID="05f9d15a-7997-4c3f-9e98-654d653304c8" containerID="ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149" exitCode=0 Dec 15 14:33:15 crc kubenswrapper[4794]: I1215 14:33:15.109966 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4fxt" event={"ID":"05f9d15a-7997-4c3f-9e98-654d653304c8","Type":"ContainerDied","Data":"ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149"} Dec 15 14:33:17 crc kubenswrapper[4794]: I1215 14:33:17.141328 4794 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-v4fxt" event={"ID":"05f9d15a-7997-4c3f-9e98-654d653304c8","Type":"ContainerStarted","Data":"5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3"} Dec 15 14:33:17 crc kubenswrapper[4794]: I1215 14:33:17.164427 4794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v4fxt" podStartSLOduration=3.067933664 podStartE2EDuration="6.164409702s" podCreationTimestamp="2025-12-15 14:33:11 +0000 UTC" firstStartedPulling="2025-12-15 14:33:13.097446638 +0000 UTC m=+2354.949469086" lastFinishedPulling="2025-12-15 14:33:16.193922696 +0000 UTC m=+2358.045945124" observedRunningTime="2025-12-15 14:33:17.159230795 +0000 UTC m=+2359.011253233" watchObservedRunningTime="2025-12-15 14:33:17.164409702 +0000 UTC m=+2359.016432140" Dec 15 14:33:21 crc kubenswrapper[4794]: I1215 14:33:21.837259 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:21 crc kubenswrapper[4794]: I1215 14:33:21.837525 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:21 crc kubenswrapper[4794]: I1215 14:33:21.894931 4794 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:22 crc kubenswrapper[4794]: I1215 14:33:22.229102 4794 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:24 crc kubenswrapper[4794]: I1215 14:33:24.738168 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e" Dec 15 14:33:24 crc kubenswrapper[4794]: E1215 14:33:24.738657 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4" Dec 15 14:33:25 crc kubenswrapper[4794]: I1215 14:33:25.494505 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4fxt"] Dec 15 14:33:25 crc kubenswrapper[4794]: I1215 14:33:25.495170 4794 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v4fxt" podUID="05f9d15a-7997-4c3f-9e98-654d653304c8" containerName="registry-server" containerID="cri-o://5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3" gracePeriod=2 Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.054305 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4fxt" Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.131537 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-utilities\") pod \"05f9d15a-7997-4c3f-9e98-654d653304c8\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.131673 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gttqf\" (UniqueName: \"kubernetes.io/projected/05f9d15a-7997-4c3f-9e98-654d653304c8-kube-api-access-gttqf\") pod \"05f9d15a-7997-4c3f-9e98-654d653304c8\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.131776 4794 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-catalog-content\") pod \"05f9d15a-7997-4c3f-9e98-654d653304c8\" (UID: \"05f9d15a-7997-4c3f-9e98-654d653304c8\") " Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.132471 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-utilities" (OuterVolumeSpecName: "utilities") pod "05f9d15a-7997-4c3f-9e98-654d653304c8" (UID: "05f9d15a-7997-4c3f-9e98-654d653304c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.149868 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f9d15a-7997-4c3f-9e98-654d653304c8-kube-api-access-gttqf" (OuterVolumeSpecName: "kube-api-access-gttqf") pod "05f9d15a-7997-4c3f-9e98-654d653304c8" (UID: "05f9d15a-7997-4c3f-9e98-654d653304c8"). InnerVolumeSpecName "kube-api-access-gttqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.194448 4794 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05f9d15a-7997-4c3f-9e98-654d653304c8" (UID: "05f9d15a-7997-4c3f-9e98-654d653304c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.221882 4794 generic.go:334] "Generic (PLEG): container finished" podID="05f9d15a-7997-4c3f-9e98-654d653304c8" containerID="5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3" exitCode=0
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.221937 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4fxt" event={"ID":"05f9d15a-7997-4c3f-9e98-654d653304c8","Type":"ContainerDied","Data":"5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3"}
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.222361 4794 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4fxt" event={"ID":"05f9d15a-7997-4c3f-9e98-654d653304c8","Type":"ContainerDied","Data":"78c5d5b09cac93525cccafd16a48fb67ea8aa2b6abc3e4bbca42d9d1af3b9d92"}
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.222393 4794 scope.go:117] "RemoveContainer" containerID="5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.221974 4794 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4fxt"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.233926 4794 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-utilities\") on node \"crc\" DevicePath \"\""
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.233958 4794 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gttqf\" (UniqueName: \"kubernetes.io/projected/05f9d15a-7997-4c3f-9e98-654d653304c8-kube-api-access-gttqf\") on node \"crc\" DevicePath \"\""
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.233970 4794 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05f9d15a-7997-4c3f-9e98-654d653304c8-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.248165 4794 scope.go:117] "RemoveContainer" containerID="ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.257685 4794 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4fxt"]
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.265517 4794 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v4fxt"]
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.283841 4794 scope.go:117] "RemoveContainer" containerID="2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.313169 4794 scope.go:117] "RemoveContainer" containerID="5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3"
Dec 15 14:33:27 crc kubenswrapper[4794]: E1215 14:33:27.313698 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3\": container with ID starting with 5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3 not found: ID does not exist" containerID="5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.313743 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3"} err="failed to get container status \"5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3\": rpc error: code = NotFound desc = could not find container \"5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3\": container with ID starting with 5b8c107f805fcd8a8b89b47a074c13e67894cc17e86924a582d389b50520bdd3 not found: ID does not exist"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.313773 4794 scope.go:117] "RemoveContainer" containerID="ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149"
Dec 15 14:33:27 crc kubenswrapper[4794]: E1215 14:33:27.314152 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149\": container with ID starting with ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149 not found: ID does not exist" containerID="ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.314178 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149"} err="failed to get container status \"ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149\": rpc error: code = NotFound desc = could not find container \"ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149\": container with ID starting with ff84d78e97287d112f7259de093cbfea9c555cb9566c7457ddfbefe852f8e149 not found: ID does not exist"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.314192 4794 scope.go:117] "RemoveContainer" containerID="2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e"
Dec 15 14:33:27 crc kubenswrapper[4794]: E1215 14:33:27.314459 4794 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e\": container with ID starting with 2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e not found: ID does not exist" containerID="2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e"
Dec 15 14:33:27 crc kubenswrapper[4794]: I1215 14:33:27.314492 4794 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e"} err="failed to get container status \"2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e\": rpc error: code = NotFound desc = could not find container \"2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e\": container with ID starting with 2e14fed605d1abb96b1a957aca53c499e0feb65ed556ebed73b45df0edd1046e not found: ID does not exist"
Dec 15 14:33:28 crc kubenswrapper[4794]: I1215 14:33:28.748397 4794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f9d15a-7997-4c3f-9e98-654d653304c8" path="/var/lib/kubelet/pods/05f9d15a-7997-4c3f-9e98-654d653304c8/volumes"
Dec 15 14:33:39 crc kubenswrapper[4794]: I1215 14:33:39.737674 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"
Dec 15 14:33:39 crc kubenswrapper[4794]: E1215 14:33:39.738507 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4"
Dec 15 14:33:53 crc kubenswrapper[4794]: I1215 14:33:53.737390 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"
Dec 15 14:33:53 crc kubenswrapper[4794]: E1215 14:33:53.738052 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4"
Dec 15 14:34:04 crc kubenswrapper[4794]: I1215 14:34:04.737983 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"
Dec 15 14:34:04 crc kubenswrapper[4794]: E1215 14:34:04.739068 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4"
Dec 15 14:34:15 crc kubenswrapper[4794]: I1215 14:34:15.737662 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"
Dec 15 14:34:15 crc kubenswrapper[4794]: E1215 14:34:15.738387 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4"
Dec 15 14:34:30 crc kubenswrapper[4794]: I1215 14:34:30.737771 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"
Dec 15 14:34:30 crc kubenswrapper[4794]: E1215 14:34:30.738473 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4"
Dec 15 14:34:43 crc kubenswrapper[4794]: I1215 14:34:43.737381 4794 scope.go:117] "RemoveContainer" containerID="da547058e4c496b874ccce2e5fdd82d3d883195de29d2e205c8c4dd1e6adda7e"
Dec 15 14:34:43 crc kubenswrapper[4794]: E1215 14:34:43.738120 4794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fq2s6_openshift-machine-config-operator(3538082f-5d54-4676-a488-7a3df6b9a1f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-fq2s6" podUID="3538082f-5d54-4676-a488-7a3df6b9a1f4"